2026-03-08T22:40:57.525 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-08T22:40:57.535 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-08T22:40:57.554 INFO:teuthology.run:Config: archive_path: /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/283
branch: squid
description: rados:standalone/{supported-random-distro$/{ubuntu_latest} workloads/scrub}
email: null
first_in_suite: false
flavor: default
job_id: '283'
ktype: distro
last_in_suite: false
machine_type: vps
name: kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps
no_nested_subset: false
openstack:
- volumes:
    count: 3
    size: 10
os_type: ubuntu
os_version: '22.04'
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    conf:
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
      osd:
        debug ms: 1
        debug osd: 20
        osd mclock iops capacity threshold hdd: 49000
    flavor: default
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  workunit:
    branch: tt-squid
    sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - mon.a
  - mgr.x
  - osd.0
  - osd.1
  - osd.2
  - client.0
seed: 5909
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
suite: rados:standalone
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 569c3e99c9b32a51b4eaf08731c728f4513ed589
targets:
  vm03.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJq5HqnngSnwEAp3p2u5pFg199KzWIpDvmLyUpGfbUmOl4pApt1+vd/FP11lotDkRKmpF5LVo8yDzC2VwwCgVBA=
tasks:
- install: null
- workunit:
    basedir: qa/standalone
    clients:
      all:
      - scrub
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge: []
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-08_21:49:43
tube: vps
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
2026-03-08T22:40:57.554 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa; will attempt to use it
2026-03-08T22:40:57.555 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_569c3e99c9b32a51b4eaf08731c728f4513ed589/qa/tasks
2026-03-08T22:40:57.555 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-08T22:40:57.555 INFO:teuthology.task.internal:Checking packages...
2026-03-08T22:40:57.555 INFO:teuthology.task.internal:Checking packages for os_type 'ubuntu', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-08T22:40:57.555 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-08T22:40:57.555 INFO:teuthology.packaging:ref: None
2026-03-08T22:40:57.555 INFO:teuthology.packaging:tag: None
2026-03-08T22:40:57.555 INFO:teuthology.packaging:branch: squid
2026-03-08T22:40:57.555 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:40:57.555 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&ref=squid
2026-03-08T22:40:58.149 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:40:58.150 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-08T22:40:58.150 INFO:teuthology.task.internal:no buildpackages task found
2026-03-08T22:40:58.150 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-08T22:40:58.151 INFO:teuthology.task.internal:Saving configuration
2026-03-08T22:40:58.154 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-08T22:40:58.155 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-08T22:40:58.162 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm03.local', 'description': '/archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/283', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'ubuntu', 'os_version': '22.04', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-08 22:40:14.659021', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:03', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJq5HqnngSnwEAp3p2u5pFg199KzWIpDvmLyUpGfbUmOl4pApt1+vd/FP11lotDkRKmpF5LVo8yDzC2VwwCgVBA='}
2026-03-08T22:40:58.162 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-08T22:40:58.163 INFO:teuthology.task.internal:roles: ubuntu@vm03.local - ['mon.a', 'mgr.x', 'osd.0', 'osd.1', 'osd.2', 'client.0']
2026-03-08T22:40:58.163 INFO:teuthology.run_tasks:Running task console_log...
2026-03-08T22:40:58.170 DEBUG:teuthology.task.console_log:vm03 does not support IPMI; excluding
2026-03-08T22:40:58.171 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f1ab7bd92d0>, signals=[15])
2026-03-08T22:40:58.171 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-08T22:40:58.171 INFO:teuthology.task.internal:Opening connections...
2026-03-08T22:40:58.171 DEBUG:teuthology.task.internal:connecting to ubuntu@vm03.local
2026-03-08T22:40:58.172 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:40:58.229 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-08T22:40:58.231 DEBUG:teuthology.orchestra.run.vm03:> uname -m
2026-03-08T22:40:58.349 INFO:teuthology.orchestra.run.vm03.stdout:x86_64
2026-03-08T22:40:58.349 DEBUG:teuthology.orchestra.run.vm03:> cat /etc/os-release
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:PRETTY_NAME="Ubuntu 22.04.5 LTS"
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:NAME="Ubuntu"
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:VERSION_ID="22.04"
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:VERSION="22.04.5 LTS (Jammy Jellyfish)"
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:VERSION_CODENAME=jammy
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:ID=ubuntu
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:ID_LIKE=debian
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:HOME_URL="https://www.ubuntu.com/"
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:SUPPORT_URL="https://help.ubuntu.com/"
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
2026-03-08T22:40:58.393 INFO:teuthology.orchestra.run.vm03.stdout:UBUNTU_CODENAME=jammy
2026-03-08T22:40:58.394 INFO:teuthology.lock.ops:Updating vm03.local on lock server
2026-03-08T22:40:58.403 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-08T22:40:58.404 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-08T22:40:58.405 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-08T22:40:58.405 DEBUG:teuthology.orchestra.run.vm03:> test '!' -e /home/ubuntu/cephtest
2026-03-08T22:40:58.437 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-08T22:40:58.439 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-08T22:40:58.439 DEBUG:teuthology.orchestra.run.vm03:> test -z $(ls -A /var/lib/ceph)
2026-03-08T22:40:58.481 INFO:teuthology.orchestra.run.vm03.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-08T22:40:58.481 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-08T22:40:58.490 DEBUG:teuthology.orchestra.run.vm03:> test -e /ceph-qa-ready
2026-03-08T22:40:58.526 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:40:58.888 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-08T22:40:58.890 INFO:teuthology.task.internal:Creating test directory...
2026-03-08T22:40:58.890 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-08T22:40:58.894 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-08T22:40:58.898 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-08T22:40:58.905 INFO:teuthology.task.internal:Creating archive directory...
2026-03-08T22:40:58.905 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-08T22:40:58.938 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-08T22:40:58.939 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-08T22:40:58.939 DEBUG:teuthology.orchestra.run.vm03:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-08T22:40:58.980 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-08T22:40:58.980 DEBUG:teuthology.orchestra.run.vm03:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-08T22:40:59.029 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:40:59.034 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:40:59.035 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-08T22:40:59.049 INFO:teuthology.task.internal:Configuring sudo...
2026-03-08T22:40:59.049 DEBUG:teuthology.orchestra.run.vm03:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-08T22:40:59.085 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-08T22:40:59.087 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-08T22:40:59.087 DEBUG:teuthology.orchestra.run.vm03:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-08T22:40:59.129 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-08T22:40:59.173 DEBUG:teuthology.orchestra.run.vm03:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-08T22:40:59.217 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T22:40:59.217 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-08T22:40:59.266 DEBUG:teuthology.orchestra.run.vm03:> sudo service rsyslog restart
2026-03-08T22:40:59.320 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-08T22:40:59.322 INFO:teuthology.task.internal:Starting timer...
2026-03-08T22:40:59.322 INFO:teuthology.run_tasks:Running task pcp...
2026-03-08T22:40:59.337 INFO:teuthology.run_tasks:Running task selinux...
2026-03-08T22:40:59.339 INFO:teuthology.task.selinux:Excluding vm03: VMs are not yet supported
2026-03-08T22:40:59.339 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-08T22:40:59.339 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-08T22:40:59.339 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-08T22:40:59.339 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-08T22:40:59.345 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-08T22:40:59.345 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-08T22:40:59.347 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-08T22:40:59.930 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-08T22:40:59.935 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-08T22:40:59.936 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventory021ckzpq --limit vm03.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-08T22:43:28.273 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm03.local')]
2026-03-08T22:43:28.274 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm03.local'
2026-03-08T22:43:28.275 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm03.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-08T22:43:28.340 DEBUG:teuthology.orchestra.run.vm03:> true
2026-03-08T22:43:28.561 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm03.local'
2026-03-08T22:43:28.561 INFO:teuthology.run_tasks:Running task clock...
2026-03-08T22:43:28.564 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-08T22:43:28.564 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-08T22:43:28.564 DEBUG:teuthology.orchestra.run.vm03:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: ntpd 4.2.8p15@1.3728-o Wed Feb 16 17:13:02 UTC 2022 (1): Starting
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Command line: ntpd -gq
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: ----------------------------------------------------
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: ntp-4 is maintained by Network Time Foundation,
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Inc. (NTF), a non-profit 501(c)(3) public-benefit
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: corporation. Support and training for ntp-4 are
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: available at https://www.nwtime.org/support
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: ----------------------------------------------------
2026-03-08T22:43:28.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: proto: precision = 0.029 usec (-25)
2026-03-08T22:43:28.620 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: basedate set to 2022-02-04
2026-03-08T22:43:28.620 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: gps base set to 2022-02-06 (week 2196)
2026-03-08T22:43:28.620 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): good hash signature
2026-03-08T22:43:28.620 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): loaded, expire=2025-12-28T00:00:00Z last=2017-01-01T00:00:00Z ofs=37
2026-03-08T22:43:28.620 INFO:teuthology.orchestra.run.vm03.stderr: 8 Mar 22:43:28 ntpd[15993]: leapsecond file ('/usr/share/zoneinfo/leap-seconds.list'): expired 71 days ago
2026-03-08T22:43:28.620 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Listen and drop on 0 v6wildcard [::]:123
2026-03-08T22:43:28.620 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Listen and drop on 1 v4wildcard 0.0.0.0:123
2026-03-08T22:43:28.621 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Listen normally on 2 lo 127.0.0.1:123
2026-03-08T22:43:28.621 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Listen normally on 3 ens3 192.168.123.103:123
2026-03-08T22:43:28.621 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Listen normally on 4 lo [::1]:123
2026-03-08T22:43:28.621 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Listen normally on 5 ens3 [fe80::5055:ff:fe00:3%2]:123
2026-03-08T22:43:28.621 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:28 ntpd[15993]: Listening on routing socket on fd #22 for interface updates
2026-03-08T22:43:29.620 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:29 ntpd[15993]: Soliciting pool server 213.202.247.29
2026-03-08T22:43:30.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:30 ntpd[15993]: Soliciting pool server 194.36.144.87
2026-03-08T22:43:30.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:30 ntpd[15993]: Soliciting pool server 85.215.166.214
2026-03-08T22:43:31.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:31 ntpd[15993]: Soliciting pool server 217.115.11.162
2026-03-08T22:43:31.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:31 ntpd[15993]: Soliciting pool server 78.47.94.77
2026-03-08T22:43:31.619 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:31 ntpd[15993]: Soliciting pool server 213.239.234.28
2026-03-08T22:43:32.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:32 ntpd[15993]: Soliciting pool server 185.228.138.224
2026-03-08T22:43:32.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:32 ntpd[15993]: Soliciting pool server 49.12.125.53
2026-03-08T22:43:32.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:32 ntpd[15993]: Soliciting pool server 82.165.178.31
2026-03-08T22:43:32.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:32 ntpd[15993]: Soliciting pool server 90.187.112.137
2026-03-08T22:43:33.617 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:33 ntpd[15993]: Soliciting pool server 81.169.217.236
2026-03-08T22:43:33.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:33 ntpd[15993]: Soliciting pool server 172.104.134.72
2026-03-08T22:43:33.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:33 ntpd[15993]: Soliciting pool server 185.252.140.126
2026-03-08T22:43:33.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:33 ntpd[15993]: Soliciting pool server 185.125.190.56
2026-03-08T22:43:33.618 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:33 ntpd[15993]: Soliciting pool server 148.251.5.46
2026-03-08T22:43:34.617 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:34 ntpd[15993]: Soliciting pool server 185.125.190.57
2026-03-08T22:43:34.617 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:34 ntpd[15993]: Soliciting pool server 148.251.54.81
2026-03-08T22:43:34.617 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:34 ntpd[15993]: Soliciting pool server 134.60.1.27
2026-03-08T22:43:34.617 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:34 ntpd[15993]: Soliciting pool server 89.58.42.129
2026-03-08T22:43:37.643 INFO:teuthology.orchestra.run.vm03.stdout: 8 Mar 22:43:37 ntpd[15993]: ntpd: time slew +0.001400 s
2026-03-08T22:43:37.644 INFO:teuthology.orchestra.run.vm03.stdout:ntpd: time slew +0.001400s
2026-03-08T22:43:37.665 INFO:teuthology.orchestra.run.vm03.stdout: remote refid st t when poll reach delay offset jitter
2026-03-08T22:43:37.665 INFO:teuthology.orchestra.run.vm03.stdout:==============================================================================
2026-03-08T22:43:37.665 INFO:teuthology.orchestra.run.vm03.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:43:37.665 INFO:teuthology.orchestra.run.vm03.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:43:37.665 INFO:teuthology.orchestra.run.vm03.stdout: 2.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:43:37.665 INFO:teuthology.orchestra.run.vm03.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:43:37.665 INFO:teuthology.orchestra.run.vm03.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000
2026-03-08T22:43:37.665 INFO:teuthology.run_tasks:Running task install...
2026-03-08T22:43:37.704 DEBUG:teuthology.task.install:project ceph
2026-03-08T22:43:37.705 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:43:37.705 DEBUG:teuthology.task.install:config {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-08T22:43:37.705 INFO:teuthology.task.install:Using flavor: default
2026-03-08T22:43:37.707 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'ceph-volume', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'ceph-volume', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-08T22:43:37.707 INFO:teuthology.task.install:extra packages: []
2026-03-08T22:43:37.707 DEBUG:teuthology.orchestra.run.vm03:> sudo apt-key list | grep Ceph
2026-03-08T22:43:37.754 INFO:teuthology.orchestra.run.vm03.stderr:Warning: apt-key is deprecated. Manage keyring files in trusted.gpg.d instead (see apt-key(8)).
2026-03-08T22:43:37.775 INFO:teuthology.orchestra.run.vm03.stdout:uid [ unknown] Ceph automated package build (Ceph automated package build)
2026-03-08T22:43:37.775 INFO:teuthology.orchestra.run.vm03.stdout:uid [ unknown] Ceph.com (release key)
2026-03-08T22:43:37.775 INFO:teuthology.task.install.deb:Installing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on remote deb x86_64
2026-03-08T22:43:37.775 INFO:teuthology.task.install.deb:Installing system (non-project) packages: python3-xmltodict, python3-jmespath on remote deb x86_64
2026-03-08T22:43:37.775 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-08T22:43:38.365 INFO:teuthology.task.install.deb:Pulling from https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default/
2026-03-08T22:43:38.365 INFO:teuthology.task.install.deb:Package version is 19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:43:38.912 DEBUG:teuthology.orchestra.run.vm03:> set -ex
2026-03-08T22:43:38.912 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/etc/apt/sources.list.d/ceph.list
2026-03-08T22:43:38.922 DEBUG:teuthology.orchestra.run.vm03:> sudo apt-get update
2026-03-08T22:43:39.116 INFO:teuthology.orchestra.run.vm03.stdout:Hit:1 https://archive.ubuntu.com/ubuntu jammy InRelease
2026-03-08T22:43:39.123 INFO:teuthology.orchestra.run.vm03.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy-updates InRelease
2026-03-08T22:43:39.129 INFO:teuthology.orchestra.run.vm03.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-backports InRelease
2026-03-08T22:43:39.468 INFO:teuthology.orchestra.run.vm03.stdout:Hit:4 https://security.ubuntu.com/ubuntu jammy-security InRelease
2026-03-08T22:43:39.504 INFO:teuthology.orchestra.run.vm03.stdout:Ign:5 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy InRelease
2026-03-08T22:43:39.624 INFO:teuthology.orchestra.run.vm03.stdout:Get:6 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release [7662 B]
2026-03-08T22:43:39.744 INFO:teuthology.orchestra.run.vm03.stdout:Ign:7 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy Release.gpg
2026-03-08T22:43:39.864 INFO:teuthology.orchestra.run.vm03.stdout:Get:8 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 Packages [18.1 kB]
2026-03-08T22:43:39.946 INFO:teuthology.orchestra.run.vm03.stdout:Fetched 25.8 kB in 1s (29.8 kB/s)
2026-03-08T22:43:40.684 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-08T22:43:40.697 DEBUG:teuthology.orchestra.run.vm03:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install ceph=19.2.3-678-ge911bdeb-1jammy cephadm=19.2.3-678-ge911bdeb-1jammy ceph-mds=19.2.3-678-ge911bdeb-1jammy ceph-mgr=19.2.3-678-ge911bdeb-1jammy ceph-common=19.2.3-678-ge911bdeb-1jammy ceph-fuse=19.2.3-678-ge911bdeb-1jammy ceph-test=19.2.3-678-ge911bdeb-1jammy ceph-volume=19.2.3-678-ge911bdeb-1jammy radosgw=19.2.3-678-ge911bdeb-1jammy python3-rados=19.2.3-678-ge911bdeb-1jammy python3-rgw=19.2.3-678-ge911bdeb-1jammy python3-cephfs=19.2.3-678-ge911bdeb-1jammy python3-rbd=19.2.3-678-ge911bdeb-1jammy libcephfs2=19.2.3-678-ge911bdeb-1jammy libcephfs-dev=19.2.3-678-ge911bdeb-1jammy librados2=19.2.3-678-ge911bdeb-1jammy librbd1=19.2.3-678-ge911bdeb-1jammy rbd-fuse=19.2.3-678-ge911bdeb-1jammy
2026-03-08T22:43:40.732 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-08T22:43:40.955 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree...
2026-03-08T22:43:40.956 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-08T22:43:41.229 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required:
2026-03-08T22:43:41.230 INFO:teuthology.orchestra.run.vm03.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1
2026-03-08T22:43:41.231 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 sg3-utils sg3-utils-udev
2026-03-08T22:43:41.231 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them.
2026-03-08T22:43:41.232 INFO:teuthology.orchestra.run.vm03.stdout:The following additional packages will be installed:
2026-03-08T22:43:41.232 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base ceph-mgr-cephadm ceph-mgr-dashboard ceph-mgr-diskprediction-local
2026-03-08T22:43:41.232 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-k8sevents ceph-mgr-modules-core ceph-mon ceph-osd jq
2026-03-08T22:43:41.232 INFO:teuthology.orchestra.run.vm03.stdout: libdouble-conversion3 libfuse2 libjq1 liblttng-ust1 liblua5.3-dev libnbd0
2026-03-08T22:43:41.232 INFO:teuthology.orchestra.run.vm03.stdout: liboath0 libonig5 libpcre2-16-0 libqt5core5a libqt5dbus5 libqt5network5
2026-03-08T22:43:41.233 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsqlite3-mod-ceph
2026-03-08T22:43:41.233 INFO:teuthology.orchestra.run.vm03.stdout: libthrift-0.16.0 lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli
2026-03-08T22:43:41.233 INFO:teuthology.orchestra.run.vm03.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh
2026-03-08T22:43:41.233 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot
2026-03-08T22:43:41.233 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-08T22:43:41.233 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-pastescript python3-pecan python3-pluggy python3-portend
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable python3-psutil python3-py python3-pygments
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-pytest python3-repoze.lru
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-simplejson python3-singledispatch python3-sklearn
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-toml python3-waitress python3-wcwidth python3-webob
2026-03-08T22:43:41.234 INFO:teuthology.orchestra.run.vm03.stdout: python3-websocket python3-webtest python3-werkzeug python3-zc.lockfile
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: qttranslations5-l10n smartmontools socat unzip xmlstarlet zip
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout:Suggested packages:
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: python3-influxdb readline-doc python3-beaker python-mako-doc
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: python-natsort-doc httpd-wsgi libapache2-mod-python libapache2-mod-scgi
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: libjs-mochikit python-pecan-doc python-psutil-doc subversion
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: python-pygments-doc ttf-bitstream-vera python-pyinotify-doc python3-dap
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: python-sklearn-doc ipython3 python-waitress-doc python-webob-doc
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: python-webtest-doc python-werkzeug-doc python3-watchdog gsmartcontrol
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: smart-notifier mailx | mailutils
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout:Recommended packages:
2026-03-08T22:43:41.235 INFO:teuthology.orchestra.run.vm03.stdout: btrfs-tools
2026-03-08T22:43:41.280 INFO:teuthology.orchestra.run.vm03.stdout:The following NEW packages will be installed:
2026-03-08T22:43:41.280 INFO:teuthology.orchestra.run.vm03.stdout: ceph ceph-base ceph-common ceph-fuse ceph-mds ceph-mgr ceph-mgr-cephadm
2026-03-08T22:43:41.280 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-k8sevents
2026-03-08T22:43:41.280 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core ceph-mon ceph-osd ceph-test ceph-volume cephadm jq
2026-03-08T22:43:41.280 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-dev libcephfs2 libdouble-conversion3 libfuse2 libjq1 liblttng-ust1
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: liblua5.3-dev libnbd0 liboath0 libonig5 libpcre2-16-0 libqt5core5a
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: libqt5dbus5 libqt5network5 libradosstriper1 librdkafka1 libreadline-dev
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: librgw2 libsqlite3-mod-ceph libthrift-0.16.0 lua-any lua-sec lua-socket
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: lua5.1 luarocks nvme-cli pkg-config python-asyncssh-doc
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-argparse python3-ceph-common python3-cephfs python3-cheroot
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-iniconfig
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools
2026-03-08T22:43:41.281 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-pastescript python3-pecan python3-pluggy python3-portend
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-prettytable python3-psutil python3-py python3-pygments
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-pytest python3-rados python3-rbd
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze.lru python3-requests-oauthlib python3-rgw python3-routes
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-rsa python3-simplegeneric python3-simplejson python3-singledispatch
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn python3-sklearn-lib python3-tempita python3-tempora
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-threadpoolctl python3-toml python3-waitress python3-wcwidth
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug
2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile qttranslations5-l10n
radosgw rbd-fuse smartmontools 2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout: socat unzip xmlstarlet zip 2026-03-08T22:43:41.282 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be upgraded: 2026-03-08T22:43:41.283 INFO:teuthology.orchestra.run.vm03.stdout: librados2 librbd1 2026-03-08T22:43:41.382 INFO:teuthology.orchestra.run.vm03.stdout:2 upgraded, 107 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:43:41.383 INFO:teuthology.orchestra.run.vm03.stdout:Need to get 178 MB of archives. 2026-03-08T22:43:41.383 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 782 MB of additional disk space will be used. 2026-03-08T22:43:41.383 INFO:teuthology.orchestra.run.vm03.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblttng-ust1 amd64 2.13.1-1ubuntu1 [190 kB] 2026-03-08T22:43:41.415 INFO:teuthology.orchestra.run.vm03.stdout:Get:2 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libdouble-conversion3 amd64 3.1.7-4 [39.0 kB] 2026-03-08T22:43:41.415 INFO:teuthology.orchestra.run.vm03.stdout:Get:3 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libpcre2-16-0 amd64 10.39-3ubuntu0.1 [203 kB] 2026-03-08T22:43:41.422 INFO:teuthology.orchestra.run.vm03.stdout:Get:4 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5core5a amd64 5.15.3+dfsg-2ubuntu0.2 [2006 kB] 2026-03-08T22:43:41.447 INFO:teuthology.orchestra.run.vm03.stdout:Get:5 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5dbus5 amd64 5.15.3+dfsg-2ubuntu0.2 [222 kB] 2026-03-08T22:43:41.448 INFO:teuthology.orchestra.run.vm03.stdout:Get:6 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 libqt5network5 amd64 5.15.3+dfsg-2ubuntu0.2 [731 kB] 2026-03-08T22:43:41.455 INFO:teuthology.orchestra.run.vm03.stdout:Get:7 https://archive.ubuntu.com/ubuntu jammy/universe amd64 libthrift-0.16.0 amd64 0.16.0-2 [267 kB] 2026-03-08T22:43:41.464 INFO:teuthology.orchestra.run.vm03.stdout:Get:8 
https://archive.ubuntu.com/ubuntu jammy/universe amd64 libnbd0 amd64 1.10.5-1 [71.3 kB] 2026-03-08T22:43:41.464 INFO:teuthology.orchestra.run.vm03.stdout:Get:9 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-wcwidth all 0.2.5+dfsg1-1 [21.9 kB] 2026-03-08T22:43:41.465 INFO:teuthology.orchestra.run.vm03.stdout:Get:10 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-prettytable all 2.5.0-2 [31.3 kB] 2026-03-08T22:43:41.465 INFO:teuthology.orchestra.run.vm03.stdout:Get:11 https://archive.ubuntu.com/ubuntu jammy/universe amd64 librdkafka1 amd64 1.8.0-1build1 [633 kB] 2026-03-08T22:43:41.468 INFO:teuthology.orchestra.run.vm03.stdout:Get:12 https://archive.ubuntu.com/ubuntu jammy/main amd64 libreadline-dev amd64 8.1.2-1 [166 kB] 2026-03-08T22:43:41.470 INFO:teuthology.orchestra.run.vm03.stdout:Get:13 https://archive.ubuntu.com/ubuntu jammy/main amd64 liblua5.3-dev amd64 5.3.6-1build1 [167 kB] 2026-03-08T22:43:41.471 INFO:teuthology.orchestra.run.vm03.stdout:Get:14 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua5.1 amd64 5.1.5-8.1build4 [94.6 kB] 2026-03-08T22:43:41.471 INFO:teuthology.orchestra.run.vm03.stdout:Get:15 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-any all 27ubuntu1 [5034 B] 2026-03-08T22:43:41.472 INFO:teuthology.orchestra.run.vm03.stdout:Get:16 https://archive.ubuntu.com/ubuntu jammy/main amd64 zip amd64 3.0-12build2 [176 kB] 2026-03-08T22:43:41.475 INFO:teuthology.orchestra.run.vm03.stdout:Get:17 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 unzip amd64 6.0-26ubuntu3.2 [175 kB] 2026-03-08T22:43:41.477 INFO:teuthology.orchestra.run.vm03.stdout:Get:18 https://archive.ubuntu.com/ubuntu jammy/universe amd64 luarocks all 3.8.0+dfsg1-1 [140 kB] 2026-03-08T22:43:41.478 INFO:teuthology.orchestra.run.vm03.stdout:Get:19 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 liboath0 amd64 2.6.7-3ubuntu0.1 [41.3 kB] 2026-03-08T22:43:41.478 INFO:teuthology.orchestra.run.vm03.stdout:Get:20 
https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.functools all 3.4.0-2 [9030 B] 2026-03-08T22:43:41.483 INFO:teuthology.orchestra.run.vm03.stdout:Get:21 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-cheroot all 8.5.2+ds1-1ubuntu3.1 [71.1 kB] 2026-03-08T22:43:41.484 INFO:teuthology.orchestra.run.vm03.stdout:Get:22 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.classes all 3.2.1-3 [6452 B] 2026-03-08T22:43:41.484 INFO:teuthology.orchestra.run.vm03.stdout:Get:23 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.text all 3.6.0-2 [8716 B] 2026-03-08T22:43:41.484 INFO:teuthology.orchestra.run.vm03.stdout:Get:24 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jaraco.collections all 3.4.0-2 [11.4 kB] 2026-03-08T22:43:41.484 INFO:teuthology.orchestra.run.vm03.stdout:Get:25 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempora all 4.1.2-1 [14.8 kB] 2026-03-08T22:43:41.485 INFO:teuthology.orchestra.run.vm03.stdout:Get:26 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-portend all 3.0.0-1 [7240 B] 2026-03-08T22:43:41.491 INFO:teuthology.orchestra.run.vm03.stdout:Get:27 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-zc.lockfile all 2.0-1 [8980 B] 2026-03-08T22:43:41.491 INFO:teuthology.orchestra.run.vm03.stdout:Get:28 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cherrypy3 all 18.6.1-4 [208 kB] 2026-03-08T22:43:41.493 INFO:teuthology.orchestra.run.vm03.stdout:Get:29 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-natsort all 8.0.2-1 [35.3 kB] 2026-03-08T22:43:41.493 INFO:teuthology.orchestra.run.vm03.stdout:Get:30 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-logutils all 0.3.3-8 [17.6 kB] 2026-03-08T22:43:41.498 INFO:teuthology.orchestra.run.vm03.stdout:Get:31 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-mako all 1.1.3+ds1-2ubuntu0.1 [60.5 kB] 2026-03-08T22:43:41.544 
INFO:teuthology.orchestra.run.vm03.stdout:Get:32 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplegeneric all 0.8.1-3 [11.3 kB] 2026-03-08T22:43:41.544 INFO:teuthology.orchestra.run.vm03.stdout:Get:33 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-singledispatch all 3.4.0.3-3 [7320 B] 2026-03-08T22:43:41.544 INFO:teuthology.orchestra.run.vm03.stdout:Get:34 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-webob all 1:1.8.6-1.1ubuntu0.1 [86.7 kB] 2026-03-08T22:43:41.544 INFO:teuthology.orchestra.run.vm03.stdout:Get:35 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-waitress all 1.4.4-1.1ubuntu1.1 [47.0 kB] 2026-03-08T22:43:41.544 INFO:teuthology.orchestra.run.vm03.stdout:Get:36 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-tempita all 0.5.2-6ubuntu1 [15.1 kB] 2026-03-08T22:43:41.544 INFO:teuthology.orchestra.run.vm03.stdout:Get:37 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-paste all 3.5.0+dfsg1-1 [456 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:38 https://archive.ubuntu.com/ubuntu jammy/main amd64 python-pastedeploy-tpl all 2.1.1-1 [4892 B] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:39 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastedeploy all 2.1.1-1 [26.6 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:40 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-webtest all 2.0.35-1 [28.5 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:41 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pecan all 1.3.3-4ubuntu2 [87.3 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:42 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-werkzeug all 2.0.2+dfsg1-1ubuntu0.22.04.3 [181 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:43 https://archive.ubuntu.com/ubuntu jammy/universe amd64 
libfuse2 amd64 2.9.9-5ubuntu3 [90.3 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:44 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python3-asyncssh all 2.5.0-1ubuntu0.1 [189 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:45 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-repoze.lru all 0.7-2 [12.1 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:46 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-routes all 2.5.1-1ubuntu1 [89.0 kB] 2026-03-08T22:43:41.545 INFO:teuthology.orchestra.run.vm03.stdout:Get:47 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn-lib amd64 0.23.2-5ubuntu6 [2058 kB] 2026-03-08T22:43:41.551 INFO:teuthology.orchestra.run.vm03.stdout:Get:48 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-joblib all 0.17.0-4ubuntu1 [204 kB] 2026-03-08T22:43:41.552 INFO:teuthology.orchestra.run.vm03.stdout:Get:49 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-threadpoolctl all 3.1.0-1 [21.3 kB] 2026-03-08T22:43:41.552 INFO:teuthology.orchestra.run.vm03.stdout:Get:50 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-sklearn all 0.23.2-5ubuntu6 [1829 kB] 2026-03-08T22:43:41.560 INFO:teuthology.orchestra.run.vm03.stdout:Get:51 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-cachetools all 5.0.0-1 [9722 B] 2026-03-08T22:43:41.561 INFO:teuthology.orchestra.run.vm03.stdout:Get:52 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-rsa all 4.8-1 [28.4 kB] 2026-03-08T22:43:41.561 INFO:teuthology.orchestra.run.vm03.stdout:Get:53 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-google-auth all 1.5.1-3 [35.7 kB] 2026-03-08T22:43:41.562 INFO:teuthology.orchestra.run.vm03.stdout:Get:54 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-requests-oauthlib all 1.3.0+ds-0.1 [18.7 kB] 2026-03-08T22:43:41.562 INFO:teuthology.orchestra.run.vm03.stdout:Get:55 
https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-websocket all 1.2.3-1 [34.7 kB] 2026-03-08T22:43:41.562 INFO:teuthology.orchestra.run.vm03.stdout:Get:56 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-kubernetes all 12.0.1-1ubuntu1 [353 kB] 2026-03-08T22:43:41.564 INFO:teuthology.orchestra.run.vm03.stdout:Get:57 https://archive.ubuntu.com/ubuntu jammy/main amd64 libonig5 amd64 6.9.7.1-2build1 [172 kB] 2026-03-08T22:43:41.568 INFO:teuthology.orchestra.run.vm03.stdout:Get:58 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 libjq1 amd64 1.6-2.1ubuntu3.1 [133 kB] 2026-03-08T22:43:41.569 INFO:teuthology.orchestra.run.vm03.stdout:Get:59 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 jq amd64 1.6-2.1ubuntu3.1 [52.5 kB] 2026-03-08T22:43:41.573 INFO:teuthology.orchestra.run.vm03.stdout:Get:60 https://archive.ubuntu.com/ubuntu jammy/main amd64 socat amd64 1.7.4.1-3ubuntu4 [349 kB] 2026-03-08T22:43:41.576 INFO:teuthology.orchestra.run.vm03.stdout:Get:61 https://archive.ubuntu.com/ubuntu jammy/universe amd64 xmlstarlet amd64 1.6.1-2.1 [265 kB] 2026-03-08T22:43:41.578 INFO:teuthology.orchestra.run.vm03.stdout:Get:62 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-socket amd64 3.0~rc1+git+ac3201d-6 [78.9 kB] 2026-03-08T22:43:41.579 INFO:teuthology.orchestra.run.vm03.stdout:Get:63 https://archive.ubuntu.com/ubuntu jammy/universe amd64 lua-sec amd64 1.0.2-1 [37.6 kB] 2026-03-08T22:43:41.579 INFO:teuthology.orchestra.run.vm03.stdout:Get:64 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 nvme-cli amd64 1.16-3ubuntu0.3 [474 kB] 2026-03-08T22:43:41.601 INFO:teuthology.orchestra.run.vm03.stdout:Get:65 https://archive.ubuntu.com/ubuntu jammy/main amd64 pkg-config amd64 0.29.2-1ubuntu3 [48.2 kB] 2026-03-08T22:43:41.601 INFO:teuthology.orchestra.run.vm03.stdout:Get:66 https://archive.ubuntu.com/ubuntu jammy-updates/universe amd64 python-asyncssh-doc all 2.5.0-1ubuntu0.1 [309 kB] 2026-03-08T22:43:41.603 
INFO:teuthology.orchestra.run.vm03.stdout:Get:67 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-iniconfig all 1.1.1-2 [6024 B] 2026-03-08T22:43:41.603 INFO:teuthology.orchestra.run.vm03.stdout:Get:68 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pastescript all 2.0.2-4 [54.6 kB] 2026-03-08T22:43:41.604 INFO:teuthology.orchestra.run.vm03.stdout:Get:69 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pluggy all 0.13.0-7.1 [19.0 kB] 2026-03-08T22:43:41.604 INFO:teuthology.orchestra.run.vm03.stdout:Get:70 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-psutil amd64 5.9.0-1build1 [158 kB] 2026-03-08T22:43:41.605 INFO:teuthology.orchestra.run.vm03.stdout:Get:71 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-py all 1.10.0-1 [71.9 kB] 2026-03-08T22:43:41.605 INFO:teuthology.orchestra.run.vm03.stdout:Get:72 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 python3-pygments all 2.11.2+dfsg-2ubuntu0.1 [750 kB] 2026-03-08T22:43:41.609 INFO:teuthology.orchestra.run.vm03.stdout:Get:73 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-pyinotify all 0.9.6-1.3 [24.8 kB] 2026-03-08T22:43:41.609 INFO:teuthology.orchestra.run.vm03.stdout:Get:74 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-toml all 0.10.2-1 [16.5 kB] 2026-03-08T22:43:41.616 INFO:teuthology.orchestra.run.vm03.stdout:Get:75 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-pytest all 6.2.5-1ubuntu2 [214 kB] 2026-03-08T22:43:41.618 INFO:teuthology.orchestra.run.vm03.stdout:Get:76 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-simplejson amd64 3.17.6-1build1 [54.7 kB] 2026-03-08T22:43:41.619 INFO:teuthology.orchestra.run.vm03.stdout:Get:77 https://archive.ubuntu.com/ubuntu jammy/universe amd64 qttranslations5-l10n all 5.15.3-1 [1983 kB] 2026-03-08T22:43:41.646 INFO:teuthology.orchestra.run.vm03.stdout:Get:78 https://archive.ubuntu.com/ubuntu jammy-updates/main amd64 smartmontools amd64 
7.2-1ubuntu0.1 [583 kB] 2026-03-08T22:43:41.871 INFO:teuthology.orchestra.run.vm03.stdout:Get:79 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librbd1 amd64 19.2.3-678-ge911bdeb-1jammy [3257 kB] 2026-03-08T22:43:42.664 INFO:teuthology.orchestra.run.vm03.stdout:Get:80 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librados2 amd64 19.2.3-678-ge911bdeb-1jammy [3597 kB] 2026-03-08T22:43:42.784 INFO:teuthology.orchestra.run.vm03.stdout:Get:81 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs2 amd64 19.2.3-678-ge911bdeb-1jammy [979 kB] 2026-03-08T22:43:42.902 INFO:teuthology.orchestra.run.vm03.stdout:Get:82 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rados amd64 19.2.3-678-ge911bdeb-1jammy [357 kB] 2026-03-08T22:43:42.903 INFO:teuthology.orchestra.run.vm03.stdout:Get:83 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-argparse all 19.2.3-678-ge911bdeb-1jammy [32.9 kB] 2026-03-08T22:43:42.903 INFO:teuthology.orchestra.run.vm03.stdout:Get:84 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-cephfs amd64 19.2.3-678-ge911bdeb-1jammy [184 kB] 2026-03-08T22:43:42.904 INFO:teuthology.orchestra.run.vm03.stdout:Get:85 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-ceph-common all 19.2.3-678-ge911bdeb-1jammy [70.1 kB] 2026-03-08T22:43:42.904 INFO:teuthology.orchestra.run.vm03.stdout:Get:86 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default 
jammy/main amd64 python3-rbd amd64 19.2.3-678-ge911bdeb-1jammy [334 kB] 2026-03-08T22:43:42.906 INFO:teuthology.orchestra.run.vm03.stdout:Get:87 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 librgw2 amd64 19.2.3-678-ge911bdeb-1jammy [6935 kB] 2026-03-08T22:43:43.263 INFO:teuthology.orchestra.run.vm03.stdout:Get:88 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 python3-rgw amd64 19.2.3-678-ge911bdeb-1jammy [112 kB] 2026-03-08T22:43:43.264 INFO:teuthology.orchestra.run.vm03.stdout:Get:89 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libradosstriper1 amd64 19.2.3-678-ge911bdeb-1jammy [470 kB] 2026-03-08T22:43:43.274 INFO:teuthology.orchestra.run.vm03.stdout:Get:90 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-common amd64 19.2.3-678-ge911bdeb-1jammy [26.5 MB] 2026-03-08T22:43:44.286 INFO:teuthology.orchestra.run.vm03.stdout:Get:91 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-base amd64 19.2.3-678-ge911bdeb-1jammy [5178 kB] 2026-03-08T22:43:44.465 INFO:teuthology.orchestra.run.vm03.stdout:Get:92 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-modules-core all 19.2.3-678-ge911bdeb-1jammy [248 kB] 2026-03-08T22:43:44.466 INFO:teuthology.orchestra.run.vm03.stdout:Get:93 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libsqlite3-mod-ceph amd64 19.2.3-678-ge911bdeb-1jammy [125 kB] 2026-03-08T22:43:44.467 INFO:teuthology.orchestra.run.vm03.stdout:Get:94 
https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr amd64 19.2.3-678-ge911bdeb-1jammy [1081 kB] 2026-03-08T22:43:44.518 INFO:teuthology.orchestra.run.vm03.stdout:Get:95 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mon amd64 19.2.3-678-ge911bdeb-1jammy [6239 kB] 2026-03-08T22:43:44.749 INFO:teuthology.orchestra.run.vm03.stdout:Get:96 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-osd amd64 19.2.3-678-ge911bdeb-1jammy [23.0 MB] 2026-03-08T22:43:45.564 INFO:teuthology.orchestra.run.vm03.stdout:Get:97 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph amd64 19.2.3-678-ge911bdeb-1jammy [14.2 kB] 2026-03-08T22:43:45.574 INFO:teuthology.orchestra.run.vm03.stdout:Get:98 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-fuse amd64 19.2.3-678-ge911bdeb-1jammy [1173 kB] 2026-03-08T22:43:45.619 INFO:teuthology.orchestra.run.vm03.stdout:Get:99 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mds amd64 19.2.3-678-ge911bdeb-1jammy [2503 kB] 2026-03-08T22:43:45.708 INFO:teuthology.orchestra.run.vm03.stdout:Get:100 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 cephadm amd64 19.2.3-678-ge911bdeb-1jammy [798 kB] 2026-03-08T22:43:45.736 INFO:teuthology.orchestra.run.vm03.stdout:Get:101 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-cephadm all 19.2.3-678-ge911bdeb-1jammy [157 kB] 2026-03-08T22:43:45.737 
INFO:teuthology.orchestra.run.vm03.stdout:Get:102 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-dashboard all 19.2.3-678-ge911bdeb-1jammy [2396 kB] 2026-03-08T22:43:45.846 INFO:teuthology.orchestra.run.vm03.stdout:Get:103 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-diskprediction-local all 19.2.3-678-ge911bdeb-1jammy [8625 kB] 2026-03-08T22:43:46.152 INFO:teuthology.orchestra.run.vm03.stdout:Get:104 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-mgr-k8sevents all 19.2.3-678-ge911bdeb-1jammy [14.3 kB] 2026-03-08T22:43:46.152 INFO:teuthology.orchestra.run.vm03.stdout:Get:105 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-test amd64 19.2.3-678-ge911bdeb-1jammy [52.1 MB] 2026-03-08T22:43:48.184 INFO:teuthology.orchestra.run.vm03.stdout:Get:106 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 ceph-volume all 19.2.3-678-ge911bdeb-1jammy [135 kB] 2026-03-08T22:43:48.185 INFO:teuthology.orchestra.run.vm03.stdout:Get:107 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 libcephfs-dev amd64 19.2.3-678-ge911bdeb-1jammy [41.0 kB] 2026-03-08T22:43:48.185 INFO:teuthology.orchestra.run.vm03.stdout:Get:108 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 radosgw amd64 19.2.3-678-ge911bdeb-1jammy [13.7 MB] 2026-03-08T22:43:48.662 INFO:teuthology.orchestra.run.vm03.stdout:Get:109 https://1.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/ubuntu/jammy/flavors/default jammy/main amd64 rbd-fuse 
amd64 19.2.3-678-ge911bdeb-1jammy [92.2 kB] 2026-03-08T22:43:48.986 INFO:teuthology.orchestra.run.vm03.stdout:Fetched 178 MB in 7s (24.2 MB/s) 2026-03-08T22:43:49.250 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package liblttng-ust1:amd64. 2026-03-08T22:43:49.282 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 111717 files and directories currently installed.) 2026-03-08T22:43:49.284 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../000-liblttng-ust1_2.13.1-1ubuntu1_amd64.deb ... 2026-03-08T22:43:49.286 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-08T22:43:49.306 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libdouble-conversion3:amd64. 2026-03-08T22:43:49.312 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../001-libdouble-conversion3_3.1.7-4_amd64.deb ... 2026-03-08T22:43:49.313 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-08T22:43:49.329 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libpcre2-16-0:amd64. 2026-03-08T22:43:49.335 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../002-libpcre2-16-0_10.39-3ubuntu0.1_amd64.deb ... 2026-03-08T22:43:49.335 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 
2026-03-08T22:43:49.356 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libqt5core5a:amd64. 2026-03-08T22:43:49.362 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../003-libqt5core5a_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:43:49.366 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:43:49.407 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libqt5dbus5:amd64. 2026-03-08T22:43:49.413 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../004-libqt5dbus5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:43:49.413 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:43:49.434 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libqt5network5:amd64. 2026-03-08T22:43:49.441 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../005-libqt5network5_5.15.3+dfsg-2ubuntu0.2_amd64.deb ... 2026-03-08T22:43:49.442 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:43:49.471 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libthrift-0.16.0:amd64. 2026-03-08T22:43:49.476 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../006-libthrift-0.16.0_0.16.0-2_amd64.deb ... 2026-03-08T22:43:49.477 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-08T22:43:49.502 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../007-librbd1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:43:49.504 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking librbd1 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 
2026-03-08T22:43:49.619 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../008-librados2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:43:49.621 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking librados2 (19.2.3-678-ge911bdeb-1jammy) over (17.2.9-0ubuntu0.22.04.2) ... 2026-03-08T22:43:49.699 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libnbd0. 2026-03-08T22:43:49.700 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../009-libnbd0_1.10.5-1_amd64.deb ... 2026-03-08T22:43:49.701 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libnbd0 (1.10.5-1) ... 2026-03-08T22:43:49.720 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libcephfs2. 2026-03-08T22:43:49.727 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../010-libcephfs2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:43:49.728 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:43:49.755 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-rados. 2026-03-08T22:43:49.760 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../011-python3-rados_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:43:49.761 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:43:49.779 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-ceph-argparse. 2026-03-08T22:43:49.785 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../012-python3-ceph-argparse_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:43:49.785 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:43:49.801 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-cephfs. 
2026-03-08T22:43:49.807 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../013-python3-cephfs_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:43:49.808 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:43:49.826 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-ceph-common. 2026-03-08T22:43:49.831 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../014-python3-ceph-common_19.2.3-678-ge911bdeb-1jammy_all.deb ... 2026-03-08T22:43:49.832 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:43:49.851 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-wcwidth. 2026-03-08T22:43:49.856 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../015-python3-wcwidth_0.2.5+dfsg1-1_all.deb ... 2026-03-08T22:43:49.857 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-08T22:43:49.875 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-prettytable. 2026-03-08T22:43:49.880 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../016-python3-prettytable_2.5.0-2_all.deb ... 2026-03-08T22:43:49.881 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-prettytable (2.5.0-2) ... 2026-03-08T22:43:49.898 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-rbd. 2026-03-08T22:43:49.904 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../017-python3-rbd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ... 2026-03-08T22:43:49.905 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:43:49.927 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package librdkafka1:amd64. 
2026-03-08T22:43:49.935 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../018-librdkafka1_1.8.0-1build1_amd64.deb ... 2026-03-08T22:43:49.936 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-08T22:43:49.962 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libreadline-dev:amd64. 2026-03-08T22:43:49.968 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../019-libreadline-dev_8.1.2-1_amd64.deb ... 2026-03-08T22:43:49.968 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libreadline-dev:amd64 (8.1.2-1) ... 2026-03-08T22:43:49.989 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package liblua5.3-dev:amd64. 2026-03-08T22:43:49.995 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../020-liblua5.3-dev_5.3.6-1build1_amd64.deb ... 2026-03-08T22:43:49.995 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking liblua5.3-dev:amd64 (5.3.6-1build1) ... 2026-03-08T22:43:50.017 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package lua5.1. 2026-03-08T22:43:50.022 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../021-lua5.1_5.1.5-8.1build4_amd64.deb ... 2026-03-08T22:43:50.023 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking lua5.1 (5.1.5-8.1build4) ... 2026-03-08T22:43:50.042 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package lua-any. 2026-03-08T22:43:50.046 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../022-lua-any_27ubuntu1_all.deb ... 2026-03-08T22:43:50.047 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking lua-any (27ubuntu1) ... 2026-03-08T22:43:50.059 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package zip. 2026-03-08T22:43:50.063 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../023-zip_3.0-12build2_amd64.deb ... 
2026-03-08T22:43:50.064 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking zip (3.0-12build2) ...
2026-03-08T22:43:50.080 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package unzip.
2026-03-08T22:43:50.085 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../024-unzip_6.0-26ubuntu3.2_amd64.deb ...
2026-03-08T22:43:50.086 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking unzip (6.0-26ubuntu3.2) ...
2026-03-08T22:43:50.102 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package luarocks.
2026-03-08T22:43:50.106 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../025-luarocks_3.8.0+dfsg1-1_all.deb ...
2026-03-08T22:43:50.107 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking luarocks (3.8.0+dfsg1-1) ...
2026-03-08T22:43:50.154 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package librgw2.
2026-03-08T22:43:50.159 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../026-librgw2_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:50.160 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking librgw2 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:50.282 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-rgw.
2026-03-08T22:43:50.288 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../027-python3-rgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:50.289 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-rgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:50.308 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package liboath0:amd64.
2026-03-08T22:43:50.313 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../028-liboath0_2.6.7-3ubuntu0.1_amd64.deb ...
2026-03-08T22:43:50.313 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking liboath0:amd64 (2.6.7-3ubuntu0.1) ...
2026-03-08T22:43:50.327 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libradosstriper1.
2026-03-08T22:43:50.332 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../029-libradosstriper1_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:50.332 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:50.358 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-common.
2026-03-08T22:43:50.363 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../030-ceph-common_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:50.364 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:50.765 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-base.
2026-03-08T22:43:50.770 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../031-ceph-base_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:50.775 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-base (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:50.889 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jaraco.functools.
2026-03-08T22:43:50.895 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../032-python3-jaraco.functools_3.4.0-2_all.deb ...
2026-03-08T22:43:50.896 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jaraco.functools (3.4.0-2) ...
2026-03-08T22:43:50.910 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-cheroot.
2026-03-08T22:43:50.915 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../033-python3-cheroot_8.5.2+ds1-1ubuntu3.1_all.deb ...
2026-03-08T22:43:50.916 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-cheroot (8.5.2+ds1-1ubuntu3.1) ...
2026-03-08T22:43:50.939 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jaraco.classes.
2026-03-08T22:43:50.944 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../034-python3-jaraco.classes_3.2.1-3_all.deb ...
2026-03-08T22:43:50.945 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jaraco.classes (3.2.1-3) ...
2026-03-08T22:43:50.959 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jaraco.text.
2026-03-08T22:43:50.965 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../035-python3-jaraco.text_3.6.0-2_all.deb ...
2026-03-08T22:43:50.965 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jaraco.text (3.6.0-2) ...
2026-03-08T22:43:50.980 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jaraco.collections.
2026-03-08T22:43:50.985 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../036-python3-jaraco.collections_3.4.0-2_all.deb ...
2026-03-08T22:43:50.986 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jaraco.collections (3.4.0-2) ...
2026-03-08T22:43:51.001 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-tempora.
2026-03-08T22:43:51.006 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../037-python3-tempora_4.1.2-1_all.deb ...
2026-03-08T22:43:51.008 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-tempora (4.1.2-1) ...
2026-03-08T22:43:51.026 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-portend.
2026-03-08T22:43:51.032 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../038-python3-portend_3.0.0-1_all.deb ...
2026-03-08T22:43:51.033 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-portend (3.0.0-1) ...
2026-03-08T22:43:51.050 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-zc.lockfile.
2026-03-08T22:43:51.057 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../039-python3-zc.lockfile_2.0-1_all.deb ...
2026-03-08T22:43:51.058 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-zc.lockfile (2.0-1) ...
2026-03-08T22:43:51.077 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-cherrypy3.
2026-03-08T22:43:51.082 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../040-python3-cherrypy3_18.6.1-4_all.deb ...
2026-03-08T22:43:51.084 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-cherrypy3 (18.6.1-4) ...
2026-03-08T22:43:51.112 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-natsort.
2026-03-08T22:43:51.118 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../041-python3-natsort_8.0.2-1_all.deb ...
2026-03-08T22:43:51.119 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-natsort (8.0.2-1) ...
2026-03-08T22:43:51.138 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-logutils.
2026-03-08T22:43:51.143 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../042-python3-logutils_0.3.3-8_all.deb ...
2026-03-08T22:43:51.144 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-logutils (0.3.3-8) ...
2026-03-08T22:43:51.161 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-mako.
2026-03-08T22:43:51.167 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../043-python3-mako_1.1.3+ds1-2ubuntu0.1_all.deb ...
2026-03-08T22:43:51.168 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-mako (1.1.3+ds1-2ubuntu0.1) ...
2026-03-08T22:43:51.190 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-simplegeneric.
2026-03-08T22:43:51.196 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../044-python3-simplegeneric_0.8.1-3_all.deb ...
2026-03-08T22:43:51.197 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-simplegeneric (0.8.1-3) ...
2026-03-08T22:43:51.212 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-singledispatch.
2026-03-08T22:43:51.218 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../045-python3-singledispatch_3.4.0.3-3_all.deb ...
2026-03-08T22:43:51.219 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-singledispatch (3.4.0.3-3) ...
2026-03-08T22:43:51.234 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-webob.
2026-03-08T22:43:51.240 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../046-python3-webob_1%3a1.8.6-1.1ubuntu0.1_all.deb ...
2026-03-08T22:43:51.241 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-webob (1:1.8.6-1.1ubuntu0.1) ...
2026-03-08T22:43:51.263 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-waitress.
2026-03-08T22:43:51.269 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../047-python3-waitress_1.4.4-1.1ubuntu1.1_all.deb ...
2026-03-08T22:43:51.272 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-waitress (1.4.4-1.1ubuntu1.1) ...
2026-03-08T22:43:51.290 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-tempita.
2026-03-08T22:43:51.296 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../048-python3-tempita_0.5.2-6ubuntu1_all.deb ...
2026-03-08T22:43:51.297 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-tempita (0.5.2-6ubuntu1) ...
2026-03-08T22:43:51.314 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-paste.
2026-03-08T22:43:51.320 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../049-python3-paste_3.5.0+dfsg1-1_all.deb ...
2026-03-08T22:43:51.321 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-paste (3.5.0+dfsg1-1) ...
2026-03-08T22:43:51.373 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python-pastedeploy-tpl.
2026-03-08T22:43:51.380 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../050-python-pastedeploy-tpl_2.1.1-1_all.deb ...
2026-03-08T22:43:51.381 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python-pastedeploy-tpl (2.1.1-1) ...
2026-03-08T22:43:51.397 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pastedeploy.
2026-03-08T22:43:51.403 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../051-python3-pastedeploy_2.1.1-1_all.deb ...
2026-03-08T22:43:51.404 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pastedeploy (2.1.1-1) ...
2026-03-08T22:43:51.423 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-webtest.
2026-03-08T22:43:51.429 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../052-python3-webtest_2.0.35-1_all.deb ...
2026-03-08T22:43:51.430 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-webtest (2.0.35-1) ...
2026-03-08T22:43:51.449 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pecan.
2026-03-08T22:43:51.458 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../053-python3-pecan_1.3.3-4ubuntu2_all.deb ...
2026-03-08T22:43:51.458 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pecan (1.3.3-4ubuntu2) ...
2026-03-08T22:43:51.491 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-werkzeug.
2026-03-08T22:43:51.497 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../054-python3-werkzeug_2.0.2+dfsg1-1ubuntu0.22.04.3_all.deb ...
2026-03-08T22:43:51.498 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ...
2026-03-08T22:43:51.523 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-modules-core.
2026-03-08T22:43:51.528 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../055-ceph-mgr-modules-core_19.2.3-678-ge911bdeb-1jammy_all.deb ...
2026-03-08T22:43:51.529 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:51.568 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libsqlite3-mod-ceph.
2026-03-08T22:43:51.573 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../056-libsqlite3-mod-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:51.574 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:51.591 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr.
2026-03-08T22:43:51.597 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../057-ceph-mgr_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:51.598 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:51.629 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mon.
2026-03-08T22:43:51.634 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../058-ceph-mon_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:51.635 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:51.739 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libfuse2:amd64.
2026-03-08T22:43:51.745 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../059-libfuse2_2.9.9-5ubuntu3_amd64.deb ...
2026-03-08T22:43:51.745 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libfuse2:amd64 (2.9.9-5ubuntu3) ...
2026-03-08T22:43:51.763 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-osd.
2026-03-08T22:43:51.769 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../060-ceph-osd_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:51.770 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-osd (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:52.086 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph.
2026-03-08T22:43:52.092 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../061-ceph_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:52.093 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:52.109 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-fuse.
2026-03-08T22:43:52.115 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../062-ceph-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:52.116 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:52.152 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mds.
2026-03-08T22:43:52.152 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../063-ceph-mds_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:52.152 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:52.224 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package cephadm.
2026-03-08T22:43:52.228 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../064-cephadm_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:52.230 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:52.262 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-asyncssh.
2026-03-08T22:43:52.268 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../065-python3-asyncssh_2.5.0-1ubuntu0.1_all.deb ...
2026-03-08T22:43:52.271 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-asyncssh (2.5.0-1ubuntu0.1) ...
2026-03-08T22:43:52.320 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-cephadm.
2026-03-08T22:43:52.323 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../066-ceph-mgr-cephadm_19.2.3-678-ge911bdeb-1jammy_all.deb ...
2026-03-08T22:43:52.326 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:52.364 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-repoze.lru.
2026-03-08T22:43:52.365 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../067-python3-repoze.lru_0.7-2_all.deb ...
2026-03-08T22:43:52.367 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-repoze.lru (0.7-2) ...
2026-03-08T22:43:52.397 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-routes.
2026-03-08T22:43:52.403 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../068-python3-routes_2.5.1-1ubuntu1_all.deb ...
2026-03-08T22:43:52.406 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-routes (2.5.1-1ubuntu1) ...
2026-03-08T22:43:52.439 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-dashboard.
2026-03-08T22:43:52.445 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../069-ceph-mgr-dashboard_19.2.3-678-ge911bdeb-1jammy_all.deb ...
2026-03-08T22:43:52.446 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:52.864 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-sklearn-lib:amd64.
2026-03-08T22:43:52.869 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../070-python3-sklearn-lib_0.23.2-5ubuntu6_amd64.deb ...
2026-03-08T22:43:52.872 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ...
2026-03-08T22:43:52.959 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-joblib.
2026-03-08T22:43:52.965 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../071-python3-joblib_0.17.0-4ubuntu1_all.deb ...
2026-03-08T22:43:52.967 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-joblib (0.17.0-4ubuntu1) ...
2026-03-08T22:43:53.020 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-threadpoolctl.
2026-03-08T22:43:53.027 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../072-python3-threadpoolctl_3.1.0-1_all.deb ...
2026-03-08T22:43:53.028 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-threadpoolctl (3.1.0-1) ...
2026-03-08T22:43:53.068 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-sklearn.
2026-03-08T22:43:53.075 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../073-python3-sklearn_0.23.2-5ubuntu6_all.deb ...
2026-03-08T22:43:53.077 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-sklearn (0.23.2-5ubuntu6) ...
2026-03-08T22:43:53.227 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-diskprediction-local.
2026-03-08T22:43:53.234 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../074-ceph-mgr-diskprediction-local_19.2.3-678-ge911bdeb-1jammy_all.deb ...
2026-03-08T22:43:53.235 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:53.534 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-cachetools.
2026-03-08T22:43:53.540 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../075-python3-cachetools_5.0.0-1_all.deb ...
2026-03-08T22:43:53.541 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-cachetools (5.0.0-1) ...
2026-03-08T22:43:53.562 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-rsa.
2026-03-08T22:43:53.569 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../076-python3-rsa_4.8-1_all.deb ...
2026-03-08T22:43:53.570 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-rsa (4.8-1) ...
2026-03-08T22:43:53.592 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-google-auth.
2026-03-08T22:43:53.598 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../077-python3-google-auth_1.5.1-3_all.deb ...
2026-03-08T22:43:53.598 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-google-auth (1.5.1-3) ...
2026-03-08T22:43:53.623 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-requests-oauthlib.
2026-03-08T22:43:53.630 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../078-python3-requests-oauthlib_1.3.0+ds-0.1_all.deb ...
2026-03-08T22:43:53.631 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-requests-oauthlib (1.3.0+ds-0.1) ...
2026-03-08T22:43:53.653 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-websocket.
2026-03-08T22:43:53.659 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../079-python3-websocket_1.2.3-1_all.deb ...
2026-03-08T22:43:53.660 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-websocket (1.2.3-1) ...
2026-03-08T22:43:53.680 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-kubernetes.
2026-03-08T22:43:53.686 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../080-python3-kubernetes_12.0.1-1ubuntu1_all.deb ...
2026-03-08T22:43:53.703 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-kubernetes (12.0.1-1ubuntu1) ...
2026-03-08T22:43:53.859 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-mgr-k8sevents.
2026-03-08T22:43:53.866 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../081-ceph-mgr-k8sevents_19.2.3-678-ge911bdeb-1jammy_all.deb ...
2026-03-08T22:43:53.867 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:53.885 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libonig5:amd64.
2026-03-08T22:43:53.891 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../082-libonig5_6.9.7.1-2build1_amd64.deb ...
2026-03-08T22:43:53.892 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libonig5:amd64 (6.9.7.1-2build1) ...
2026-03-08T22:43:53.912 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libjq1:amd64.
2026-03-08T22:43:53.918 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../083-libjq1_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-08T22:43:53.919 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libjq1:amd64 (1.6-2.1ubuntu3.1) ...
2026-03-08T22:43:53.938 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package jq.
2026-03-08T22:43:53.945 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../084-jq_1.6-2.1ubuntu3.1_amd64.deb ...
2026-03-08T22:43:53.946 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking jq (1.6-2.1ubuntu3.1) ...
2026-03-08T22:43:53.962 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package socat.
2026-03-08T22:43:53.968 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../085-socat_1.7.4.1-3ubuntu4_amd64.deb ...
2026-03-08T22:43:53.968 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking socat (1.7.4.1-3ubuntu4) ...
2026-03-08T22:43:53.995 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package xmlstarlet.
2026-03-08T22:43:54.002 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../086-xmlstarlet_1.6.1-2.1_amd64.deb ...
2026-03-08T22:43:54.002 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking xmlstarlet (1.6.1-2.1) ...
2026-03-08T22:43:54.048 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-test.
2026-03-08T22:43:54.054 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../087-ceph-test_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:54.055 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-test (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:54.924 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package ceph-volume.
2026-03-08T22:43:54.931 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../088-ceph-volume_19.2.3-678-ge911bdeb-1jammy_all.deb ...
2026-03-08T22:43:54.932 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking ceph-volume (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:54.963 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package libcephfs-dev.
2026-03-08T22:43:54.970 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../089-libcephfs-dev_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:54.971 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:54.988 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package lua-socket:amd64.
2026-03-08T22:43:54.995 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../090-lua-socket_3.0~rc1+git+ac3201d-6_amd64.deb ...
2026-03-08T22:43:54.996 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ...
2026-03-08T22:43:55.023 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package lua-sec:amd64.
2026-03-08T22:43:55.028 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../091-lua-sec_1.0.2-1_amd64.deb ...
2026-03-08T22:43:55.030 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking lua-sec:amd64 (1.0.2-1) ...
2026-03-08T22:43:55.049 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package nvme-cli.
2026-03-08T22:43:55.056 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../092-nvme-cli_1.16-3ubuntu0.3_amd64.deb ...
2026-03-08T22:43:55.057 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking nvme-cli (1.16-3ubuntu0.3) ...
2026-03-08T22:43:55.102 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package pkg-config.
2026-03-08T22:43:55.108 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../093-pkg-config_0.29.2-1ubuntu3_amd64.deb ...
2026-03-08T22:43:55.109 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking pkg-config (0.29.2-1ubuntu3) ...
2026-03-08T22:43:55.127 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python-asyncssh-doc.
2026-03-08T22:43:55.132 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../094-python-asyncssh-doc_2.5.0-1ubuntu0.1_all.deb ...
2026-03-08T22:43:55.133 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python-asyncssh-doc (2.5.0-1ubuntu0.1) ...
2026-03-08T22:43:55.180 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-iniconfig.
2026-03-08T22:43:55.187 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../095-python3-iniconfig_1.1.1-2_all.deb ...
2026-03-08T22:43:55.188 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-iniconfig (1.1.1-2) ...
2026-03-08T22:43:55.204 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pastescript.
2026-03-08T22:43:55.210 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../096-python3-pastescript_2.0.2-4_all.deb ...
2026-03-08T22:43:55.210 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pastescript (2.0.2-4) ...
2026-03-08T22:43:55.233 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pluggy.
2026-03-08T22:43:55.238 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../097-python3-pluggy_0.13.0-7.1_all.deb ...
2026-03-08T22:43:55.239 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pluggy (0.13.0-7.1) ...
2026-03-08T22:43:55.259 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-psutil.
2026-03-08T22:43:55.264 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../098-python3-psutil_5.9.0-1build1_amd64.deb ...
2026-03-08T22:43:55.265 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-psutil (5.9.0-1build1) ...
2026-03-08T22:43:55.291 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-py.
2026-03-08T22:43:55.298 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../099-python3-py_1.10.0-1_all.deb ...
2026-03-08T22:43:55.299 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-py (1.10.0-1) ...
2026-03-08T22:43:55.324 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pygments.
2026-03-08T22:43:55.330 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../100-python3-pygments_2.11.2+dfsg-2ubuntu0.1_all.deb ...
2026-03-08T22:43:55.331 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pygments (2.11.2+dfsg-2ubuntu0.1) ...
2026-03-08T22:43:55.406 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pyinotify.
2026-03-08T22:43:55.415 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../101-python3-pyinotify_0.9.6-1.3_all.deb ...
2026-03-08T22:43:55.416 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pyinotify (0.9.6-1.3) ...
2026-03-08T22:43:55.445 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-toml.
2026-03-08T22:43:55.451 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../102-python3-toml_0.10.2-1_all.deb ...
2026-03-08T22:43:55.452 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-toml (0.10.2-1) ...
2026-03-08T22:43:55.487 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-pytest.
2026-03-08T22:43:55.494 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../103-python3-pytest_6.2.5-1ubuntu2_all.deb ...
2026-03-08T22:43:55.497 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-pytest (6.2.5-1ubuntu2) ...
2026-03-08T22:43:55.545 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-simplejson.
2026-03-08T22:43:55.547 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../104-python3-simplejson_3.17.6-1build1_amd64.deb ...
2026-03-08T22:43:55.550 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-simplejson (3.17.6-1build1) ...
2026-03-08T22:43:55.590 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package qttranslations5-l10n.
2026-03-08T22:43:55.595 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../105-qttranslations5-l10n_5.15.3-1_all.deb ...
2026-03-08T22:43:55.599 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking qttranslations5-l10n (5.15.3-1) ...
2026-03-08T22:43:55.734 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package radosgw.
2026-03-08T22:43:55.739 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../106-radosgw_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:55.740 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking radosgw (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:55.991 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package rbd-fuse.
2026-03-08T22:43:56.000 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../107-rbd-fuse_19.2.3-678-ge911bdeb-1jammy_amd64.deb ...
2026-03-08T22:43:56.006 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T22:43:56.067 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package smartmontools.
2026-03-08T22:43:56.073 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../108-smartmontools_7.2-1ubuntu0.1_amd64.deb ...
2026-03-08T22:43:56.083 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking smartmontools (7.2-1ubuntu0.1) ...
2026-03-08T22:43:56.175 INFO:teuthology.orchestra.run.vm03.stdout:Setting up smartmontools (7.2-1ubuntu0.1) ...
2026-03-08T22:43:56.446 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/smartd.service → /lib/systemd/system/smartmontools.service.
2026-03-08T22:43:56.446 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/smartmontools.service → /lib/systemd/system/smartmontools.service.
2026-03-08T22:43:56.820 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-iniconfig (1.1.1-2) ...
2026-03-08T22:43:56.894 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libdouble-conversion3:amd64 (3.1.7-4) ...
2026-03-08T22:43:56.905 INFO:teuthology.orchestra.run.vm03.stdout:Setting up nvme-cli (1.16-3ubuntu0.3) ...
2026-03-08T22:43:56.983 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmefc-boot-connections.service → /lib/systemd/system/nvmefc-boot-connections.service.
2026-03-08T22:43:57.221 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/default.target.wants/nvmf-autoconnect.service → /lib/systemd/system/nvmf-autoconnect.service. 2026-03-08T22:43:57.616 INFO:teuthology.orchestra.run.vm03.stdout:nvmf-connect.target is a disabled or a static unit, not starting it. 2026-03-08T22:43:57.622 INFO:teuthology.orchestra.run.vm03.stdout:Could not execute systemctl: at /usr/bin/deb-systemd-invoke line 142. 2026-03-08T22:43:57.624 INFO:teuthology.orchestra.run.vm03.stdout:Setting up cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:43:57.669 INFO:teuthology.orchestra.run.vm03.stdout:Adding system user cephadm....done 2026-03-08T22:43:57.678 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-waitress (1.4.4-1.1ubuntu1.1) ... 2026-03-08T22:43:57.754 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jaraco.classes (3.2.1-3) ... 2026-03-08T22:43:57.820 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-08T22:43:57.822 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jaraco.functools (3.4.0-2) ... 2026-03-08T22:43:57.887 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-repoze.lru (0.7-2) ... 2026-03-08T22:43:57.958 INFO:teuthology.orchestra.run.vm03.stdout:Setting up liboath0:amd64 (2.6.7-3ubuntu0.1) ... 2026-03-08T22:43:57.962 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-py (1.10.0-1) ... 2026-03-08T22:43:58.059 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-joblib (0.17.0-4ubuntu1) ... 2026-03-08T22:43:58.184 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-cachetools (5.0.0-1) ... 2026-03-08T22:43:58.257 INFO:teuthology.orchestra.run.vm03.stdout:Setting up unzip (6.0-26ubuntu3.2) ... 2026-03-08T22:43:58.266 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pyinotify (0.9.6-1.3) ... 
2026-03-08T22:43:58.340 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-threadpoolctl (3.1.0-1) ... 2026-03-08T22:43:58.406 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:43:58.479 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ... 2026-03-08T22:43:58.481 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libnbd0 (1.10.5-1) ... 2026-03-08T22:43:58.483 INFO:teuthology.orchestra.run.vm03.stdout:Setting up lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ... 2026-03-08T22:43:58.486 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libreadline-dev:amd64 (8.1.2-1) ... 2026-03-08T22:43:58.488 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libfuse2:amd64 (2.9.9-5ubuntu3) ... 2026-03-08T22:43:58.490 INFO:teuthology.orchestra.run.vm03.stdout:Setting up lua5.1 (5.1.5-8.1build4) ... 2026-03-08T22:43:58.494 INFO:teuthology.orchestra.run.vm03.stdout:update-alternatives: using /usr/bin/lua5.1 to provide /usr/bin/lua (lua-interpreter) in auto mode 2026-03-08T22:43:58.497 INFO:teuthology.orchestra.run.vm03.stdout:update-alternatives: using /usr/bin/luac5.1 to provide /usr/bin/luac (lua-compiler) in auto mode 2026-03-08T22:43:58.499 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-08T22:43:58.501 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-psutil (5.9.0-1build1) ... 2026-03-08T22:43:58.629 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-natsort (8.0.2-1) ... 2026-03-08T22:43:58.714 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-routes (2.5.1-1ubuntu1) ... 2026-03-08T22:43:58.800 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-simplejson (3.17.6-1build1) ... 2026-03-08T22:43:58.895 INFO:teuthology.orchestra.run.vm03.stdout:Setting up zip (3.0-12build2) ... 
2026-03-08T22:43:58.904 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pygments (2.11.2+dfsg-2ubuntu0.1) ... 2026-03-08T22:43:59.207 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-tempita (0.5.2-6ubuntu1) ... 2026-03-08T22:43:59.286 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python-pastedeploy-tpl (2.1.1-1) ... 2026-03-08T22:43:59.293 INFO:teuthology.orchestra.run.vm03.stdout:Setting up qttranslations5-l10n (5.15.3-1) ... 2026-03-08T22:43:59.295 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-08T22:43:59.387 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-asyncssh (2.5.0-1ubuntu0.1) ... 2026-03-08T22:43:59.526 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-paste (3.5.0+dfsg1-1) ... 2026-03-08T22:43:59.655 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 2026-03-08T22:43:59.800 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ... 2026-03-08T22:43:59.916 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jaraco.text (3.6.0-2) ... 2026-03-08T22:43:59.983 INFO:teuthology.orchestra.run.vm03.stdout:Setting up socat (1.7.4.1-3ubuntu4) ... 2026-03-08T22:43:59.985 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:00.082 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-08T22:44:00.648 INFO:teuthology.orchestra.run.vm03.stdout:Setting up pkg-config (0.29.2-1ubuntu3) ... 2026-03-08T22:44:00.671 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:44:00.676 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-toml (0.10.2-1) ... 2026-03-08T22:44:00.751 INFO:teuthology.orchestra.run.vm03.stdout:Setting up librdkafka1:amd64 (1.8.0-1build1) ... 
2026-03-08T22:44:00.753 INFO:teuthology.orchestra.run.vm03.stdout:Setting up xmlstarlet (1.6.1-2.1) ... 2026-03-08T22:44:00.755 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pluggy (0.13.0-7.1) ... 2026-03-08T22:44:00.831 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-zc.lockfile (2.0-1) ... 2026-03-08T22:44:00.902 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:44:00.904 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-rsa (4.8-1) ... 2026-03-08T22:44:00.976 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-singledispatch (3.4.0.3-3) ... 2026-03-08T22:44:01.046 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-logutils (0.3.3-8) ... 2026-03-08T22:44:01.116 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-tempora (4.1.2-1) ... 2026-03-08T22:44:01.186 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-simplegeneric (0.8.1-3) ... 2026-03-08T22:44:01.251 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-prettytable (2.5.0-2) ... 2026-03-08T22:44:01.322 INFO:teuthology.orchestra.run.vm03.stdout:Setting up liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-08T22:44:01.324 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-websocket (1.2.3-1) ... 2026-03-08T22:44:01.402 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-08T22:44:01.404 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-requests-oauthlib (1.3.0+ds-0.1) ... 2026-03-08T22:44:01.474 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-mako (1.1.3+ds1-2ubuntu0.1) ... 2026-03-08T22:44:01.560 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-webob (1:1.8.6-1.1ubuntu0.1) ... 2026-03-08T22:44:01.658 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jaraco.collections (3.4.0-2) ... 
2026-03-08T22:44:01.729 INFO:teuthology.orchestra.run.vm03.stdout:Setting up liblua5.3-dev:amd64 (5.3.6-1build1) ... 2026-03-08T22:44:01.730 INFO:teuthology.orchestra.run.vm03.stdout:Setting up lua-sec:amd64 (1.0.2-1) ... 2026-03-08T22:44:01.732 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-08T22:44:01.734 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pytest (6.2.5-1ubuntu2) ... 2026-03-08T22:44:01.882 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pastedeploy (2.1.1-1) ... 2026-03-08T22:44:01.953 INFO:teuthology.orchestra.run.vm03.stdout:Setting up lua-any (27ubuntu1) ... 2026-03-08T22:44:01.955 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-portend (3.0.0-1) ... 2026-03-08T22:44:02.025 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T22:44:02.027 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-google-auth (1.5.1-3) ... 2026-03-08T22:44:02.117 INFO:teuthology.orchestra.run.vm03.stdout:Setting up jq (1.6-2.1ubuntu3.1) ... 2026-03-08T22:44:02.121 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-webtest (2.0.35-1) ... 2026-03-08T22:44:02.205 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-cherrypy3 (18.6.1-4) ... 2026-03-08T22:44:02.346 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pastescript (2.0.2-4) ... 2026-03-08T22:44:02.438 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-pecan (1.3.3-4ubuntu2) ... 2026-03-08T22:44:02.550 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-08T22:44:02.558 INFO:teuthology.orchestra.run.vm03.stdout:Setting up librados2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:02.566 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T22:44:02.569 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-kubernetes (12.0.1-1ubuntu1) ... 2026-03-08T22:44:03.175 INFO:teuthology.orchestra.run.vm03.stdout:Setting up luarocks (3.8.0+dfsg1-1) ... 2026-03-08T22:44:03.191 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.199 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.207 INFO:teuthology.orchestra.run.vm03.stdout:Setting up librbd1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.212 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.220 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.291 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/remote-fs.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target. 2026-03-08T22:44:03.291 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-fuse.target → /lib/systemd/system/ceph-fuse.target. 2026-03-08T22:44:03.635 INFO:teuthology.orchestra.run.vm03.stdout:Setting up libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.637 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.639 INFO:teuthology.orchestra.run.vm03.stdout:Setting up librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.641 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.643 INFO:teuthology.orchestra.run.vm03.stdout:Setting up rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.644 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T22:44:03.646 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.648 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:03.682 INFO:teuthology.orchestra.run.vm03.stdout:Adding group ceph....done 2026-03-08T22:44:03.719 INFO:teuthology.orchestra.run.vm03.stdout:Adding system user ceph....done 2026-03-08T22:44:03.728 INFO:teuthology.orchestra.run.vm03.stdout:Setting system user ceph properties....done 2026-03-08T22:44:03.733 INFO:teuthology.orchestra.run.vm03.stdout:chown: cannot access '/var/log/ceph/*.log*': No such file or directory 2026-03-08T22:44:03.799 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /lib/systemd/system/ceph.target. 2026-03-08T22:44:04.070 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/rbdmap.service → /lib/systemd/system/rbdmap.service. 2026-03-08T22:44:04.419 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:04.421 INFO:teuthology.orchestra.run.vm03.stdout:Setting up radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:04.670 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:44:04.670 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /lib/systemd/system/ceph-radosgw.target. 2026-03-08T22:44:05.070 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:05.167 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /lib/systemd/system/ceph-crash.service. 
2026-03-08T22:44:05.572 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:05.650 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target. 2026-03-08T22:44:05.650 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /lib/systemd/system/ceph-mds.target. 2026-03-08T22:44:06.000 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:06.067 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target. 2026-03-08T22:44:06.067 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /lib/systemd/system/ceph-mgr.target. 2026-03-08T22:44:06.445 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:06.527 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target. 2026-03-08T22:44:06.527 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /lib/systemd/system/ceph-osd.target. 2026-03-08T22:44:06.921 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:06.923 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:06.936 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:06.996 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target. 
2026-03-08T22:44:06.997 INFO:teuthology.orchestra.run.vm03.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /lib/systemd/system/ceph-mon.target. 2026-03-08T22:44:07.372 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:07.384 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:07.386 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:07.400 INFO:teuthology.orchestra.run.vm03.stdout:Setting up ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T22:44:07.521 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ... 2026-03-08T22:44:07.529 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T22:44:07.543 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T22:44:07.629 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for install-info (6.8-4build1) ... 2026-03-08T22:44:08.006 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:08.006 INFO:teuthology.orchestra.run.vm03.stdout:Running kernel seems to be up-to-date. 
2026-03-08T22:44:08.006 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:08.006 INFO:teuthology.orchestra.run.vm03.stdout:Services to be restarted: 2026-03-08T22:44:08.009 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart packagekit.service 2026-03-08T22:44:08.013 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:08.014 INFO:teuthology.orchestra.run.vm03.stdout:Service restarts being deferred: 2026-03-08T22:44:08.014 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart unattended-upgrades.service 2026-03-08T22:44:08.014 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:08.014 INFO:teuthology.orchestra.run.vm03.stdout:No containers need to be restarted. 2026-03-08T22:44:08.014 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:08.014 INFO:teuthology.orchestra.run.vm03.stdout:No user sessions are running outdated binaries. 2026-03-08T22:44:08.014 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:08.014 INFO:teuthology.orchestra.run.vm03.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host. 2026-03-08T22:44:08.944 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:44:08.947 DEBUG:teuthology.orchestra.run.vm03:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" install python3-xmltodict python3-jmespath 2026-03-08T22:44:09.030 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T22:44:09.240 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T22:44:09.240 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-08T22:44:09.405 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T22:44:09.406 INFO:teuthology.orchestra.run.vm03.stdout: kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T22:44:09.406 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 sg3-utils sg3-utils-udev 2026-03-08T22:44:09.406 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T22:44:09.421 INFO:teuthology.orchestra.run.vm03.stdout:The following NEW packages will be installed: 2026-03-08T22:44:09.421 INFO:teuthology.orchestra.run.vm03.stdout: python3-jmespath python3-xmltodict 2026-03-08T22:44:09.891 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 2 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T22:44:09.891 INFO:teuthology.orchestra.run.vm03.stdout:Need to get 34.3 kB of archives. 2026-03-08T22:44:09.891 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 146 kB of additional disk space will be used. 2026-03-08T22:44:09.891 INFO:teuthology.orchestra.run.vm03.stdout:Get:1 https://archive.ubuntu.com/ubuntu jammy/main amd64 python3-jmespath all 0.10.0-1 [21.7 kB] 2026-03-08T22:44:10.114 INFO:teuthology.orchestra.run.vm03.stdout:Get:2 https://archive.ubuntu.com/ubuntu jammy/universe amd64 python3-xmltodict all 0.12.0-2 [12.6 kB] 2026-03-08T22:44:10.315 INFO:teuthology.orchestra.run.vm03.stdout:Fetched 34.3 kB in 1s (49.3 kB/s) 2026-03-08T22:44:10.330 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-jmespath. 2026-03-08T22:44:10.361 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 
55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 118577 files and directories currently installed.) 2026-03-08T22:44:10.364 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../python3-jmespath_0.10.0-1_all.deb ... 2026-03-08T22:44:10.365 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-jmespath (0.10.0-1) ... 2026-03-08T22:44:10.382 INFO:teuthology.orchestra.run.vm03.stdout:Selecting previously unselected package python3-xmltodict. 2026-03-08T22:44:10.389 INFO:teuthology.orchestra.run.vm03.stdout:Preparing to unpack .../python3-xmltodict_0.12.0-2_all.deb ... 2026-03-08T22:44:10.390 INFO:teuthology.orchestra.run.vm03.stdout:Unpacking python3-xmltodict (0.12.0-2) ... 2026-03-08T22:44:10.419 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-xmltodict (0.12.0-2) ... 2026-03-08T22:44:10.493 INFO:teuthology.orchestra.run.vm03.stdout:Setting up python3-jmespath (0.10.0-1) ... 2026-03-08T22:44:10.888 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:10.888 INFO:teuthology.orchestra.run.vm03.stdout:Running kernel seems to be up-to-date. 
2026-03-08T22:44:10.888 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:10.888 INFO:teuthology.orchestra.run.vm03.stdout:Services to be restarted: 2026-03-08T22:44:10.891 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart packagekit.service 2026-03-08T22:44:10.894 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:10.894 INFO:teuthology.orchestra.run.vm03.stdout:Service restarts being deferred: 2026-03-08T22:44:10.894 INFO:teuthology.orchestra.run.vm03.stdout: systemctl restart unattended-upgrades.service 2026-03-08T22:44:10.894 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:10.894 INFO:teuthology.orchestra.run.vm03.stdout:No containers need to be restarted. 2026-03-08T22:44:10.895 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:10.895 INFO:teuthology.orchestra.run.vm03.stdout:No user sessions are running outdated binaries. 2026-03-08T22:44:10.895 INFO:teuthology.orchestra.run.vm03.stdout: 2026-03-08T22:44:10.895 INFO:teuthology.orchestra.run.vm03.stdout:No VM guests are running outdated hypervisor (qemu) binaries on this host. 2026-03-08T22:44:12.012 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T22:44:12.016 DEBUG:teuthology.parallel:result is None 2026-03-08T22:44:12.016 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=ubuntu%2F22.04%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-08T22:44:12.628 DEBUG:teuthology.orchestra.run.vm03:> dpkg-query -W -f '${Version}' ceph 2026-03-08T22:44:12.637 INFO:teuthology.orchestra.run.vm03.stdout:19.2.3-678-ge911bdeb-1jammy 2026-03-08T22:44:12.638 INFO:teuthology.packaging:The installed version of ceph is 19.2.3-678-ge911bdeb-1jammy 2026-03-08T22:44:12.638 INFO:teuthology.task.install:The correct ceph version 19.2.3-678-ge911bdeb-1jammy is installed. 
2026-03-08T22:44:12.639 INFO:teuthology.task.install.util:Shipping valgrind.supp... 2026-03-08T22:44:12.639 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T22:44:12.639 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp 2026-03-08T22:44:12.688 INFO:teuthology.task.install.util:Shipping 'daemon-helper'... 2026-03-08T22:44:12.688 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T22:44:12.688 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/daemon-helper 2026-03-08T22:44:12.739 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/daemon-helper 2026-03-08T22:44:12.790 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'... 2026-03-08T22:44:12.790 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T22:44:12.790 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/adjust-ulimits 2026-03-08T22:44:12.844 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/adjust-ulimits 2026-03-08T22:44:12.894 INFO:teuthology.task.install.util:Shipping 'stdin-killer'... 2026-03-08T22:44:12.894 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T22:44:12.894 DEBUG:teuthology.orchestra.run.vm03:> sudo dd of=/usr/bin/stdin-killer 2026-03-08T22:44:12.948 DEBUG:teuthology.orchestra.run.vm03:> sudo chmod a=rx -- /usr/bin/stdin-killer 2026-03-08T22:44:12.999 INFO:teuthology.run_tasks:Running task workunit... 2026-03-08T22:44:13.004 INFO:tasks.workunit:Pulling workunits from ref 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:44:13.004 INFO:tasks.workunit:Making a separate scratch dir for every client... 
2026-03-08T22:44:13.004 INFO:tasks.workunit:timeout=3h 2026-03-08T22:44:13.004 INFO:tasks.workunit:cleanup=True 2026-03-08T22:44:13.004 DEBUG:teuthology.orchestra.run.vm03:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:44:13.046 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-08T22:44:13.046 INFO:teuthology.orchestra.run.vm03.stderr:stat: cannot statx '/home/ubuntu/cephtest/mnt.0': No such file or directory 2026-03-08T22:44:13.046 DEBUG:teuthology.orchestra.run.vm03:> mkdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T22:44:13.094 INFO:tasks.workunit:Created dir /home/ubuntu/cephtest/mnt.0 2026-03-08T22:44:13.094 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/mnt.0 && mkdir -- client.0 2026-03-08T22:44:13.138 DEBUG:teuthology.orchestra.run.vm03:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 569c3e99c9b32a51b4eaf08731c728f4513ed589 2026-03-08T22:44:13.183 INFO:tasks.workunit.client.0.vm03.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-08T22:45:04.603 INFO:tasks.workunit.client.0.vm03.stderr:Note: switching to '569c3e99c9b32a51b4eaf08731c728f4513ed589'. 2026-03-08T22:45:04.603 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-08T22:45:04.603 INFO:tasks.workunit.client.0.vm03.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr:state without impacting any branches by switching back to a branch. 
2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr: git switch -c 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr:Or undo this operation with: 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr: git switch - 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr: 2026-03-08T22:45:04.604 INFO:tasks.workunit.client.0.vm03.stderr:HEAD is now at 569c3e99c9b qa/rgw: bucket notifications use pynose 2026-03-08T22:45:04.610 DEBUG:teuthology.orchestra.run.vm03:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/standalone && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-08T22:45:04.654 DEBUG:teuthology.orchestra.run.vm03:> set -ex 2026-03-08T22:45:04.654 DEBUG:teuthology.orchestra.run.vm03:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-08T22:45:04.696 INFO:tasks.workunit:Running workunits matching scrub on client.0... 2026-03-08T22:45:04.697 INFO:tasks.workunit:Running workunit scrub/osd-mapper.sh... 
2026-03-08T22:45:04.697 DEBUG:teuthology.orchestra.run.vm03:workunit test scrub/osd-mapper.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh 2026-03-08T22:45:04.745 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T22:45:04.749 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-mapper 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:11: run: local dir=td/osd-mapper 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:12: run: shift 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:14: run: export CEPH_MON=127.0.0.1:7144 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:14: run: CEPH_MON=127.0.0.1:7144 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:15: run: export CEPH_ARGS 2026-03-08T22:45:04.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:16: run: uuidgen 2026-03-08T22:45:04.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:16: run: CEPH_ARGS+='--fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none ' 2026-03-08T22:45:04.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:17: run: 
CEPH_ARGS+='--mon-host=127.0.0.1:7144 '
2026-03-08T22:45:04.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:19: run: export -n CEPH_CLI_TEST_DUP_COMMAND
2026-03-08T22:45:04.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:20: run: set
2026-03-08T22:45:04.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:20: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p'
2026-03-08T22:45:04.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:20: run: local funcs=TEST_truncated_sna_record
2026-03-08T22:45:04.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:21: run: for func in $funcs
2026-03-08T22:45:04.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:22: run: setup td/osd-mapper
2026-03-08T22:45:04.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-mapper
2026-03-08T22:45:04.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-mapper
2026-03-08T22:45:04.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-mapper
2026-03-08T22:45:04.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:45:04.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-mapper KILL
2026-03-08T22:45:04.752
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:45:04.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:45:04.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:45:04.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:45:04.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:45:04.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:45:04.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:45:04.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:45:04.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:45:04.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:45:04.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:45:04.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:45:04.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:45:04.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:45:04.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:45:04.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:45:04.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:45:04.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:45:04.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-mapper
2026-03-08T22:45:04.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:45:04.757
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:45:04.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19288
2026-03-08T22:45:04.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19288
2026-03-08T22:45:04.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:45:04.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:45:04.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-mapper
2026-03-08T22:45:04.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:45:04.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:45:04.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19288
2026-03-08T22:45:04.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.19288
2026-03-08T22:45:04.760 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stdout:Dir: td/osd-mapper
2026-03-08T22:45:04.761
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']'
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-mapper 1' TERM HUP INT
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:23: run: TEST_truncated_sna_record td/osd-mapper
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:45: TEST_truncated_sna_record: local dir=td/osd-mapper
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:46: TEST_truncated_sna_record: cluster_conf=(['osds_num']='3' ['pgs_in_pool']='4' ['pool_name']='test')
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:46: TEST_truncated_sna_record: local -A cluster_conf
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:52: TEST_truncated_sna_record: local extr_dbg=3
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:53: TEST_truncated_sna_record: (( extr_dbg > 1 ))
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:53: TEST_truncated_sna_record: echo
'Dir: td/osd-mapper'
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:54: TEST_truncated_sna_record: standard_scrub_cluster td/osd-mapper cluster_conf
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:229: standard_scrub_cluster: local dir=td/osd-mapper
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:230: standard_scrub_cluster: local -n args=cluster_conf
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:232: standard_scrub_cluster: local OSDS=3
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:233: standard_scrub_cluster: local pg_num=4
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:234: standard_scrub_cluster: local poolname=test
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:235: standard_scrub_cluster: args['pool_name']=test
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:236: standard_scrub_cluster: local extra_pars=
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:237: standard_scrub_cluster: local debug_msg=dbg
2026-03-08T22:45:04.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:240: standard_scrub_cluster: local saved_echo_flag=x
2026-03-08T22:45:04.761
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:241: standard_scrub_cluster: set +x
2026-03-08T22:45:05.110 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 8e6d8e4f-cfbc-413a-a7f2-e75511379b1b
2026-03-08T22:45:05.236 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:45:05.264 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:05.252+0000 7f8e0d2d58c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:05.266 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:05.256+0000 7f8e0d2d58c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:05.268 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:05.256+0000 7f8e0d2d58c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:05.268 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:05.256+0000 7f8e0d2d58c0 -1 bdev(0x55f1d4f36c00 td/osd-mapper/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:45:05.268 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:05.256+0000 7f8e0d2d58c0 -1 bluestore(td/osd-mapper/0) _read_fsid unparsable uuid
2026-03-08T22:45:08.173 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T22:45:08.394 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T22:45:08.412 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:08.400+0000 7ff4420a28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:08.413 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:08.400+0000 7ff4420a28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:08.415 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:08.400+0000 7ff4420a28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:08.576 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:45:09.744 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:45:09.860 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:09.848+0000 7ff4420a28c0 -1 Falling back to public interface
2026-03-08T22:45:10.870 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:10.856+0000 7ff4420a28c0 -1 osd.0 0 log_to_monitors true
2026-03-08T22:45:10.916 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:45:12.099 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:45:13.279 INFO:tasks.workunit.client.0.vm03.stdout:4
2026-03-08T22:45:14.494 INFO:tasks.workunit.client.0.vm03.stdout:5
2026-03-08T22:45:14.655 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1544427489,v1:127.0.0.1:6803/1544427489] [v2:127.0.0.1:6804/1544427489,v1:127.0.0.1:6805/1544427489] exists,up 8e6d8e4f-cfbc-413a-a7f2-e75511379b1b
2026-03-08T22:45:14.658 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 ef20a359-e037-453d-8899-a3608ce06625
2026-03-08T22:45:14.826 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:45:14.856 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:14.844+0000 7f9eb73b88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:14.858 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:14.844+0000 7f9eb73b88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:14.859 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:14.848+0000 7f9eb73b88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:14.859 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:14.848+0000 7f9eb73b88c0 -1 bdev(0x55fcb5927c00 td/osd-mapper/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:45:14.859 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:14.848+0000 7f9eb73b88c0 -1 bluestore(td/osd-mapper/1) _read_fsid unparsable uuid
2026-03-08T22:45:17.493 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T22:45:17.707 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T22:45:17.728 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:17.716+0000 7f41135318c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:17.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:17.720+0000 7f41135318c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:17.732 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:17.720+0000 7f41135318c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:17.881 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:45:18.663 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:18.652+0000 7f41135318c0 -1 Falling back to public interface
2026-03-08T22:45:19.051 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:45:19.627 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:19.616+0000 7f41135318c0 -1 osd.1 0 log_to_monitors true
2026-03-08T22:45:20.218 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:45:21.413 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:45:21.579 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/388761672,v1:127.0.0.1:6811/388761672] [v2:127.0.0.1:6812/388761672,v1:127.0.0.1:6813/388761672] exists,up ef20a359-e037-453d-8899-a3608ce06625
2026-03-08T22:45:21.582 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 faaad73c-6057-403a-a68b-13ff60daef71
2026-03-08T22:45:21.765 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:45:21.796 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:21.784+0000 7f2b51c518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:21.798 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:21.784+0000 7f2b51c518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:21.799 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:21.788+0000 7f2b51c518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:21.799 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:21.788+0000 7f2b51c518c0 -1 bdev(0x55f479573c00 td/osd-mapper/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:45:21.799 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:21.788+0000 7f2b51c518c0 -1 bluestore(td/osd-mapper/2) _read_fsid unparsable uuid
2026-03-08T22:45:24.349 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository
2026-03-08T22:45:24.538 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2
2026-03-08T22:45:24.554 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:24.540+0000 7fed6851a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:24.555 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:24.544+0000 7fed6851a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:24.556 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:24.544+0000 7fed6851a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:45:24.713 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:45:24.999 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:24.988+0000 7fed6851a8c0 -1 Falling back to public interface
2026-03-08T22:45:25.888 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:45:25.974 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:25.960+0000 7fed6851a8c0 -1 osd.2 0 log_to_monitors true
2026-03-08T22:45:27.076 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:45:27.246 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/191920459,v1:127.0.0.1:6819/191920459] [v2:127.0.0.1:6820/191920459,v1:127.0.0.1:6821/191920459] exists,up faaad73c-6057-403a-a68b-13ff60daef71
2026-03-08T22:45:27.468 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created
2026-03-08T22:45:28.933 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836484
2026-03-08T22:45:30.277 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963
2026-03-08T22:45:30.443 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442
2026-03-08T22:45:32.312 INFO:tasks.workunit.client.0.vm03.stdout:standard_scrub_cluster: dbg: test pool is test 1
2026-03-08T22:45:32.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:55: TEST_truncated_sna_record: ceph tell 'osd.*' config set osd_stats_update_period_not_scrubbing 1
2026-03-08T22:45:32.379 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: {
2026-03-08T22:45:32.379 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = ''
osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' osd_stats_update_period_not_scrubbing = '' (not observed, change may require restart) "
2026-03-08T22:45:32.379 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T22:45:32.386 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: {
2026-03-08T22:45:32.386 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' osd_stats_update_period_not_scrubbing = '' (not observed, change may require restart) "
2026-03-08T22:45:32.386 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T22:45:32.394 INFO:tasks.workunit.client.0.vm03.stdout:osd.2: {
2026-03-08T22:45:32.394 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = ''
osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' osd_stats_update_period_not_scrubbing = '' (not observed, change may require restart) "
2026-03-08T22:45:32.394 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T22:45:32.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:56: TEST_truncated_sna_record: ceph tell 'osd.*' config set osd_stats_update_period_scrubbing 1
2026-03-08T22:45:32.468 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: {
2026-03-08T22:45:32.468 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_stats_update_period_scrubbing = '' (not observed, change may require restart) "
2026-03-08T22:45:32.468 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T22:45:32.475 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: {
2026-03-08T22:45:32.475 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_stats_update_period_scrubbing = '' (not observed, change may require restart) "
2026-03-08T22:45:32.475 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T22:45:32.483 INFO:tasks.workunit.client.0.vm03.stdout:osd.2: {
2026-03-08T22:45:32.483 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_stats_update_period_scrubbing = '' (not observed, change may require restart) "
2026-03-08T22:45:32.483 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T22:45:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:58: TEST_truncated_sna_record: local osdn=3
2026-03-08T22:45:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:59: TEST_truncated_sna_record: local poolid=1
2026-03-08T22:45:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:60: TEST_truncated_sna_record: local
poolname=test
2026-03-08T22:45:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:61: TEST_truncated_sna_record: local objname=objxxx
2026-03-08T22:45:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:64: TEST_truncated_sna_record: make_a_clone test objxxx snap01 snap02
2026-03-08T22:45:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:32: make_a_clone: local saved_echo_flag=x
2026-03-08T22:45:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:33: make_a_clone: set +x
2026-03-08T22:45:32.559 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap01
2026-03-08T22:45:32.661 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap02
2026-03-08T22:45:32.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:65: TEST_truncated_sna_record: make_a_clone test objxxx snap13
2026-03-08T22:45:32.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:32: make_a_clone: local saved_echo_flag=x
2026-03-08T22:45:32.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:33: make_a_clone: set +x
2026-03-08T22:45:32.766 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap13
2026-03-08T22:45:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:66: TEST_truncated_sna_record: make_a_clone test objxxx snap24 snap25
2026-03-08T22:45:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:32: make_a_clone: local saved_echo_flag=x
2026-03-08T22:45:32.769
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:33: make_a_clone: set +x
2026-03-08T22:45:32.871 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap24
2026-03-08T22:45:32.976 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap25
2026-03-08T22:45:32.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:67: TEST_truncated_sna_record: rados -p test put objxxx -
2026-03-08T22:45:32.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:67: TEST_truncated_sna_record: echo 20986
2026-03-08T22:45:33.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:70: TEST_truncated_sna_record: ceph --format=json-pretty osd map test objxxx
2026-03-08T22:45:33.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:70: TEST_truncated_sna_record: jq -r .pgid
2026-03-08T22:45:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:70: TEST_truncated_sna_record: local pgid=1.3
2026-03-08T22:45:33.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:71: TEST_truncated_sna_record: ceph --format=json-pretty osd map test objxxx
2026-03-08T22:45:33.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:71: TEST_truncated_sna_record: jq -r '.up[0]'
2026-03-08T22:45:33.329 INFO:tasks.workunit.client.0.vm03.stdout:pgid is 1.3 (primary: osd.1)
2026-03-08T22:45:33.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:71: TEST_truncated_sna_record: local osd=1
2026-03-08T22:45:33.329
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:72: TEST_truncated_sna_record: echo 'pgid is 1.3 (primary: osd.1)'
2026-03-08T22:45:33.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:74: TEST_truncated_sna_record: set_query_debug 1.3
2026-03-08T22:45:33.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:303: set_query_debug: local pgid=1.3
2026-03-08T22:45:33.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:305: set_query_debug: ceph pg dump pgs_brief
2026-03-08T22:45:33.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:305: set_query_debug: awk -v 'pg=^1.3' -n -e '$0 ~ pg { print(gensub(/[^0-9]*([0-9]+).*/,"\\1","g",$5)); }'
2026-03-08T22:45:33.487 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs_brief
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stdout:Setting scrub debug data. Primary for 1.3 is 1
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:305: set_query_debug: local prim_osd=1
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:307: set_query_debug: echo 'Setting scrub debug data.
Primary for 1.3 is 1'
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:308: set_query_debug: get_asok_path osd.1
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19288
2026-03-08T22:45:33.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.19288/ceph-osd.1.asok
2026-03-08T22:45:33.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:308: set_query_debug: CEPH_ARGS=
2026-03-08T22:45:33.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:308: set_query_debug: ceph --format=json daemon /tmp/ceph-asok.19288/ceph-osd.1.asok scrubdebug 1.3 set sessions
2026-03-08T22:45:33.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:77: TEST_truncated_sna_record: (( extr_dbg >= 1 ))
2026-03-08T22:45:33.574
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:77: TEST_truncated_sna_record: rados --format json-pretty -p test listsnaps objxxx
2026-03-08T22:45:33.592 INFO:tasks.workunit.client.0.vm03.stdout:{"success":true}{
2026-03-08T22:45:33.592 INFO:tasks.workunit.client.0.vm03.stdout: "name": "objxxx",
2026-03-08T22:45:33.592 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 5,
2026-03-08T22:45:33.592 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T22:45:33.592 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T22:45:33.592 INFO:tasks.workunit.client.0.vm03.stdout: "id": 2,
2026-03-08T22:45:33.592 INFO:tasks.workunit.client.0.vm03.stdout: "snapshots": [
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "id": 1,
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap01"
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "id": 2,
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap02"
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "size": 6,
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "overlaps": []
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "id": 3,
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "snapshots": [
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "id": 3,
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap13" 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "size": 5, 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "overlaps": [] 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "id": 5, 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "snapshots": [ 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "id": 4, 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap24" 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "id": 5, 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap25" 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "size": 6, 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "overlaps": [] 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "id": "head", 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "snapshots": [], 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: "size": 6 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout: ] 
2026-03-08T22:45:33.593 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:45:33.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:80: TEST_truncated_sna_record: ceph pg 1.3 deep-scrub 2026-03-08T22:45:33.659 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:45:33.659 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T22:45:33.659 INFO:tasks.workunit.client.0.vm03.stdout: "must": true, 2026-03-08T22:45:33.659 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "0.000000" 2026-03-08T22:45:33.660 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:45:33.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:84: TEST_truncated_sna_record: sleep 3 2026-03-08T22:45:36.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:85: TEST_truncated_sna_record: ceph pg dump pgs 2026-03-08T22:45:36.819 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:45:36.819 INFO:tasks.workunit.client.0.vm03.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.574908+0000 0'0 20:16 [1,2,0] 1 [1,2,0] 1 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:45:36.819 INFO:tasks.workunit.client.0.vm03.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.574702+0000 0'0 25:30 [0,1,2] 0 [0,1,2] 0 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 
2026-03-08T22:45:36.819 INFO:tasks.workunit.client.0.vm03.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.769667+0000 0'0 25:28 [2,0,1] 2 [2,0,1] 2 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:45:36.819 INFO:tasks.workunit.client.0.vm03.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.574852+0000 0'0 20:16 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:45:36.819 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:36.819 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:45:36.819 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:45:36.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:86: TEST_truncated_sna_record: grep -a -q -- 'event: --^^^^---- ScrubFinished' td/osd-mapper/osd.1.log 2026-03-08T22:45:36.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:90: TEST_truncated_sna_record: ceph pg dump pgs 2026-03-08T22:45:36.983 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:45:36.983 INFO:tasks.workunit.client.0.vm03.stdout:1.3 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.574908+0000 0'0 20:16 [1,2,0] 1 
[1,2,0] 1 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:45:36.983 INFO:tasks.workunit.client.0.vm03.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.574702+0000 0'0 25:30 [0,1,2] 0 [0,1,2] 0 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:45:36.983 INFO:tasks.workunit.client.0.vm03.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.769667+0000 0'0 25:28 [2,0,1] 2 [2,0,1] 2 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:45:36.983 INFO:tasks.workunit.client.0.vm03.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:45:27.574852+0000 0'0 20:16 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:45:36.983 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:36.983 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:45:36.983 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:45:36.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:91: TEST_truncated_sna_record: ceph osd set noscrub 2026-03-08T22:45:37.403 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T22:45:37.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:92: TEST_truncated_sna_record: ceph osd set nodeep-scrub 2026-03-08T22:45:37.642 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T22:45:37.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:93: TEST_truncated_sna_record: sleep 5 2026-03-08T22:45:42.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:94: TEST_truncated_sna_record: grep -a -q -v ERR td/osd-mapper/osd.1.log 2026-03-08T22:45:42.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:97: TEST_truncated_sna_record: kill_daemons td/osd-mapper TERM osd 2026-03-08T22:45:42.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:45:42.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:45:42.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:45:42.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:45:42.663 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:45:42.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:45:42.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:99: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:45:42.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:99: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/0 dump p 2026-03-08T22:45:42.782 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:42.772+0000 7f59017f1bc0 1 bdev(0x561720eb9800 td/osd-mapper/0/block) open path td/osd-mapper/0/block 2026-03-08T22:45:42.783 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:42.772+0000 7f59017f1bc0 1 bdev(0x561720eb9800 td/osd-mapper/0/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:42.783 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:42.772+0000 7f59017f1bc0 1 bdev(0x561720eb9800 td/osd-mapper/0/block) close 2026-03-08T22:45:43.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bdev(0x561720eb9800 td/osd-mapper/0/block) open path td/osd-mapper/0/block 2026-03-08T22:45:43.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bdev(0x561720eb9800 td/osd-mapper/0/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:43.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 
2026-03-08T22:45:43.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bdev(0x561720eb9c00 td/osd-mapper/0/block) open path td/osd-mapper/0/block 2026-03-08T22:45:43.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bdev(0x561720eb9c00 td/osd-mapper/0/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:43.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/0/block size 100 GiB 2026-03-08T22:45:43.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs mount 2026-03-08T22:45:43.063 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:43.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.066 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluefs mount shared_bdev_used = 0 2026-03-08T22:45:43.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.052+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:43.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:43.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _open_super_meta old nid_max 2051 2026-03-08T22:45:43.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _open_super_meta old blobid_max 20480 2026-03-08T22:45:43.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-08T22:45:43.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _open_super_meta min_alloc_size 0x1000 2026-03-08T22:45:43.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 freelist init 2026-03-08T22:45:43.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 freelist _read_cfg 2026-03-08T22:45:43.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _init_alloc 
loaded 100 GiB in 19 extents, allocator type hybrid, capacity 0x1900000000, block size 0x1000, free 0x18fffc2000, fragmentation 6.86647e-07 2026-03-08T22:45:43.094 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 bluefs umount 2026-03-08T22:45:43.094 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.080+0000 7f59017f1bc0 1 bdev(0x561720eb9c00 td/osd-mapper/0/block) close 2026-03-08T22:45:43.342 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bdev(0x561720eb9c00 td/osd-mapper/0/block) open path td/osd-mapper/0/block 2026-03-08T22:45:43.342 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bdev(0x561720eb9c00 td/osd-mapper/0/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:43.342 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/0/block size 100 GiB 2026-03-08T22:45:43.342 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs mount 2026-03-08T22:45:43.342 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:43.344 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.344 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.344 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.345 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.345 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:43.345 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluefs mount shared_bdev_used = 27459584 2026-03-08T22:45:43.345 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.332+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:43.367 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.356+0000 7f59017f1bc0 1 bluestore(td/osd-mapper/0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01.osd_superblock 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000000 0b 05 58 02 00 00 b7 56 c6 8a 4c 87 4c c4 be 7a |..X....V..L.L..z| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000010 64 0e 38 94 f7 03 00 00 00 00 1b 00 00 00 00 00 |d.8.............| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:* 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 
00 00 00 00 00 fe ff 03 00 00 00 00 00 11 00 |................| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000050 00 00 01 00 00 00 00 00 00 00 1a 00 00 00 69 6e |..............in| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000060 69 74 69 61 6c 20 66 65 61 74 75 72 65 20 73 65 |itial feature se| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000070 74 28 7e 76 2e 31 38 29 02 00 00 00 00 00 00 00 |t(~v.18)........| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000080 0d 00 00 00 70 67 69 6e 66 6f 20 6f 62 6a 65 63 |....pginfo objec| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000090 74 03 00 00 00 00 00 00 00 0e 00 00 00 6f 62 6a |t............obj| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000000a0 65 63 74 20 6c 6f 63 61 74 6f 72 04 00 00 00 00 |ect locator.....| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000000b0 00 00 00 10 00 00 00 6c 61 73 74 5f 65 70 6f 63 |.......last_epoc| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000000c0 68 5f 63 6c 65 61 6e 05 00 00 00 00 00 00 00 0a |h_clean.........| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000000d0 00 00 00 63 61 74 65 67 6f 72 69 65 73 06 00 00 |...categories...| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000000e0 00 00 00 00 00 0b 00 00 00 68 6f 62 6a 65 63 74 |.........hobject| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000000f0 70 6f 6f 6c 07 00 00 00 00 00 00 00 07 00 00 00 |pool............| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000100 62 69 67 69 6e 66 6f 08 00 00 00 00 00 00 00 0b |biginfo.........| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000110 00 00 00 6c 65 76 65 6c 64 62 69 6e 66 6f 09 00 |...leveldbinfo..| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000120 00 00 00 00 00 00 0a 00 00 
00 6c 65 76 65 6c 64 |..........leveld| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000130 62 6c 6f 67 0a 00 00 00 00 00 00 00 0a 00 00 00 |blog............| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000140 73 6e 61 70 6d 61 70 70 65 72 0b 00 00 00 00 00 |snapmapper......| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000150 00 00 0f 00 00 00 73 68 61 72 64 65 64 20 6f 62 |......sharded ob| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000160 6a 65 63 74 73 0c 00 00 00 00 00 00 00 11 00 00 |jects...........| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000170 00 74 72 61 6e 73 61 63 74 69 6f 6e 20 68 69 6e |.transaction hin| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000180 74 73 0d 00 00 00 00 00 00 00 0e 00 00 00 70 67 |ts............pg| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000190 20 6d 65 74 61 20 6f 62 6a 65 63 74 0e 00 00 00 | meta object....| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000001a0 00 00 00 00 14 00 00 00 65 78 70 6c 69 63 69 74 |........explicit| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000001b0 20 6d 69 73 73 69 6e 67 20 73 65 74 0f 00 00 00 | missing set....| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000001c0 00 00 00 00 10 00 00 00 66 61 73 74 69 6e 66 6f |........fastinfo| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000001d0 20 70 67 20 61 74 74 72 10 00 00 00 00 00 00 00 | pg attr........| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000001e0 16 00 00 00 64 65 6c 65 74 65 73 20 69 6e 20 6d |....deletes in m| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:000001f0 69 73 73 69 6e 67 20 73 65 74 11 00 00 00 00 00 |issing set......| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000200 00 00 1c 00 00 00 6e 65 77 20 73 6e 61 70 6d 61 
|......new snapma| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000210 70 70 65 72 20 6b 65 79 20 73 74 72 75 63 74 75 |pper key structu| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000220 72 65 1b 00 00 00 05 00 00 00 8e 6d 8e 4f cf bc |re.........m.O..| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000230 41 3a a7 f2 e7 55 11 37 9b 1b 00 00 00 00 00 00 |A:...U.7........| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000240 00 00 1b 00 00 00 f9 fb ad 69 be d2 2c 23 01 00 |.........i..,#..| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000250 00 00 01 00 00 00 01 00 00 00 1b 00 00 00 |..............| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:0000025e 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01~ 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.2.objxxx.. 
2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:43.368 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 02 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 01 00 00 00 00 00 00 00 02 |................| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.3.objxxx.. 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 39 00 00 00 04 03 27 00 00 00 00 00 00 00 |..9.....'.......| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 03 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 03 00 00 00 00 00 00 00 |...............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:0000003f 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.5.objxxx.. 
2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 05 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 04 00 00 00 00 00 00 00 05 |................| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 01 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.FA570D58.2.objxxx.. 
2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 02 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 03 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 03 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.FA570D58.5.objxxx.. 
2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 04 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 05 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02~ 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c1%a3%fcn%00%00%00%00%00%00%04%03~ 2026-03-08T22:45:43.369 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:43.370 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.356+0000 7f59017f1bc0 1 bluefs umount 2026-03-08T22:45:43.370 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.356+0000 7f59017f1bc0 1 bdev(0x561720eb9c00 td/osd-mapper/0/block) close 2026-03-08T22:45:43.622 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.612+0000 7f59017f1bc0 1 freelist shutdown 2026-03-08T22:45:43.622 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.612+0000 7f59017f1bc0 1 bdev(0x561720eb9800 td/osd-mapper/0/block) close 2026-03-08T22:45:43.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:100: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:45:43.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:100: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/2 dump p 2026-03-08T22:45:43.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:100: TEST_truncated_sna_record: grep -a SNA_ 2026-03-08T22:45:43.858 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.844+0000 7f80ec0d5bc0 1 bdev(0x55cc9047f800 td/osd-mapper/2/block) open path td/osd-mapper/2/block 2026-03-08T22:45:43.858 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.848+0000 7f80ec0d5bc0 1 bdev(0x55cc9047f800 td/osd-mapper/2/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:43.858 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:43.848+0000 7f80ec0d5bc0 1 bdev(0x55cc9047f800 td/osd-mapper/2/block) close 2026-03-08T22:45:44.142 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bdev(0x55cc9047f800 td/osd-mapper/2/block) open path td/osd-mapper/2/block 2026-03-08T22:45:44.142 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bdev(0x55cc9047f800 td/osd-mapper/2/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:44.142 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-08T22:45:44.142 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bdev(0x55cc9047fc00 td/osd-mapper/2/block) open path td/osd-mapper/2/block 2026-03-08T22:45:44.142 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bdev(0x55cc9047fc00 td/osd-mapper/2/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:44.143 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/2/block size 100 GiB 2026-03-08T22:45:44.143 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs mount 2026-03-08T22:45:44.143 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:44.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 
2026-03-08T22:45:44.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluefs mount shared_bdev_used = 0 2026-03-08T22:45:44.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.132+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:44.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:44.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _open_super_meta old nid_max 2051 2026-03-08T22:45:44.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _open_super_meta old blobid_max 20480 2026-03-08T22:45:44.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-08T22:45:44.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _open_super_meta 
min_alloc_size 0x1000 2026-03-08T22:45:44.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 freelist init 2026-03-08T22:45:44.172 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 freelist _read_cfg 2026-03-08T22:45:44.173 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _init_alloc loaded 100 GiB in 12 extents, allocator type hybrid, capacity 0x1900000000, block size 0x1000, free 0x18fffc2000, fragmentation 4.19618e-07 2026-03-08T22:45:44.173 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 bluefs umount 2026-03-08T22:45:44.173 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.160+0000 7f80ec0d5bc0 1 bdev(0x55cc9047fc00 td/osd-mapper/2/block) close 2026-03-08T22:45:44.422 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bdev(0x55cc9047fc00 td/osd-mapper/2/block) open path td/osd-mapper/2/block 2026-03-08T22:45:44.422 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bdev(0x55cc9047fc00 td/osd-mapper/2/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:44.422 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/2/block size 100 GiB 2026-03-08T22:45:44.422 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs mount 2026-03-08T22:45:44.423 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:44.424 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.425 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:44.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluefs mount shared_bdev_used = 27459584 2026-03-08T22:45:44.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.412+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:44.449 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.436+0000 7f80ec0d5bc0 1 bluestore(td/osd-mapper/2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:44.450 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.440+0000 7f80ec0d5bc0 1 bluefs umount 2026-03-08T22:45:44.450 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.440+0000 7f80ec0d5bc0 1 bdev(0x55cc9047fc00 td/osd-mapper/2/block) close 2026-03-08T22:45:44.702 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.692+0000 7f80ec0d5bc0 1 freelist shutdown 2026-03-08T22:45:44.702 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:44.692+0000 7f80ec0d5bc0 1 bdev(0x55cc9047f800 td/osd-mapper/2/block) close 2026-03-08T22:45:44.926 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:44.926 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:44.927 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 2026-03-08T22:45:44.927 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:44.927 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.FA570D58.5.objxxx.. 
2026-03-08T22:45:44.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:101: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:45:44.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:101: TEST_truncated_sna_record: grep -a SNA_ /tmp/oo2.dump 2026-03-08T22:45:44.927 INFO:tasks.workunit.client.0.vm03.stderr:grep: /tmp/oo2.dump: No such file or directory 2026-03-08T22:45:44.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:102: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:45:44.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:102: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/2 dump p 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01.osd_superblock 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000000 0b 05 58 02 00 00 b7 56 c6 8a 4c 87 4c c4 be 7a |..X....V..L.L..z| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000010 64 0e 38 94 f7 03 02 00 00 00 1b 00 00 00 00 00 |d.8.............| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:* 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 fe ff 03 00 00 00 00 00 11 00 |................| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000050 00 00 01 00 00 00 00 00 00 00 1a 00 00 00 69 6e |..............in| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000060 69 74 69 61 6c 20 66 65 61 74 75 72 65 20 73 65 |itial feature se| 2026-03-08T22:45:45.527 
INFO:tasks.workunit.client.0.vm03.stdout:00000070 74 28 7e 76 2e 31 38 29 02 00 00 00 00 00 00 00 |t(~v.18)........| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000080 0d 00 00 00 70 67 69 6e 66 6f 20 6f 62 6a 65 63 |....pginfo objec| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000090 74 03 00 00 00 00 00 00 00 0e 00 00 00 6f 62 6a |t............obj| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000000a0 65 63 74 20 6c 6f 63 61 74 6f 72 04 00 00 00 00 |ect locator.....| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000000b0 00 00 00 10 00 00 00 6c 61 73 74 5f 65 70 6f 63 |.......last_epoc| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000000c0 68 5f 63 6c 65 61 6e 05 00 00 00 00 00 00 00 0a |h_clean.........| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000000d0 00 00 00 63 61 74 65 67 6f 72 69 65 73 06 00 00 |...categories...| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000000e0 00 00 00 00 00 0b 00 00 00 68 6f 62 6a 65 63 74 |.........hobject| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000000f0 70 6f 6f 6c 07 00 00 00 00 00 00 00 07 00 00 00 |pool............| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000100 62 69 67 69 6e 66 6f 08 00 00 00 00 00 00 00 0b |biginfo.........| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000110 00 00 00 6c 65 76 65 6c 64 62 69 6e 66 6f 09 00 |...leveldbinfo..| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000120 00 00 00 00 00 00 0a 00 00 00 6c 65 76 65 6c 64 |..........leveld| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000130 62 6c 6f 67 0a 00 00 00 00 00 00 00 0a 00 00 00 |blog............| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000140 73 6e 61 70 6d 61 70 70 65 72 0b 00 00 00 00 00 |snapmapper......| 2026-03-08T22:45:45.527 
INFO:tasks.workunit.client.0.vm03.stdout:00000150 00 00 0f 00 00 00 73 68 61 72 64 65 64 20 6f 62 |......sharded ob| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000160 6a 65 63 74 73 0c 00 00 00 00 00 00 00 11 00 00 |jects...........| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000170 00 74 72 61 6e 73 61 63 74 69 6f 6e 20 68 69 6e |.transaction hin| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000180 74 73 0d 00 00 00 00 00 00 00 0e 00 00 00 70 67 |ts............pg| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000190 20 6d 65 74 61 20 6f 62 6a 65 63 74 0e 00 00 00 | meta object....| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000001a0 00 00 00 00 14 00 00 00 65 78 70 6c 69 63 69 74 |........explicit| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000001b0 20 6d 69 73 73 69 6e 67 20 73 65 74 0f 00 00 00 | missing set....| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000001c0 00 00 00 00 10 00 00 00 66 61 73 74 69 6e 66 6f |........fastinfo| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000001d0 20 70 67 20 61 74 74 72 10 00 00 00 00 00 00 00 | pg attr........| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000001e0 16 00 00 00 64 65 6c 65 74 65 73 20 69 6e 20 6d |....deletes in m| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:000001f0 69 73 73 69 6e 67 20 73 65 74 11 00 00 00 00 00 |issing set......| 2026-03-08T22:45:45.527 INFO:tasks.workunit.client.0.vm03.stdout:00000200 00 00 1c 00 00 00 6e 65 77 20 73 6e 61 70 6d 61 |......new snapma| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000210 70 70 65 72 20 6b 65 79 20 73 74 72 75 63 74 75 |pper key structu| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000220 72 65 1b 00 00 00 0f 00 00 00 fa aa d7 3c 60 57 |re...........<`W| 2026-03-08T22:45:45.528 
INFO:tasks.workunit.client.0.vm03.stdout:00000230 40 3a a6 8b 13 ff 60 da ef 71 00 00 00 00 00 00 |@:....`..q......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000240 00 00 1b 00 00 00 07 fc ad 69 4f 25 e9 04 01 00 |.........iO%....| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000250 00 00 01 00 00 00 01 00 00 00 1b 00 00 00 |..............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:0000025e 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01~ 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 02 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 01 00 00 00 00 00 00 00 02 |................| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 39 00 00 00 04 03 27 00 00 00 00 00 00 00 |..9.....'.......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 03 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 03 00 00 00 00 00 00 00 |...............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:0000003f 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 05 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 04 00 00 00 00 00 00 00 05 |................| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FA570D58.2.objxxx.. 
2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 01 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 02 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 03 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 03 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 04 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:45.528 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.FA570D58.5.objxxx.. 
2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 05 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02~ 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c1%a3%fcn%00%00%00%00%00%00%04%03~ 2026-03-08T22:45:45.529 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:46.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:104: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/1 dump p 2026-03-08T22:45:46.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:104: TEST_truncated_sna_record: awk -e '{print $2;}' 2026-03-08T22:45:46.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:104: TEST_truncated_sna_record: grep -a -e 'SNA_[0-9]_000000000000000[0-9]_000000000000000' 2026-03-08T22:45:46.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:104: TEST_truncated_sna_record: wc -l 2026-03-08T22:45:47.086 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:104: TEST_truncated_sna_record: local num_sna_b4=5 2026-03-08T22:45:47.087 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:106: TEST_truncated_sna_record: expr 3 - 1 2026-03-08T22:45:47.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:106: TEST_truncated_sna_record: seq 0 2 2026-03-08T22:45:47.088 INFO:tasks.workunit.client.0.vm03.stdout:corrupting the SnapMapper DB of osd.0 (db: td/osd-mapper/0) 2026-03-08T22:45:47.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:106: TEST_truncated_sna_record: for sdn in $(seq 0 $(expr $osdn - 1)) 2026-03-08T22:45:47.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:108: TEST_truncated_sna_record: kvdir=td/osd-mapper/0 2026-03-08T22:45:47.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:109: TEST_truncated_sna_record: echo 'corrupting the SnapMapper DB of osd.0 (db: td/osd-mapper/0)' 2026-03-08T22:45:47.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:110: TEST_truncated_sna_record: (( extr_dbg >= 3 )) 2026-03-08T22:45:47.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:110: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/0 dump p 2026-03-08T22:45:47.100 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.088+0000 7f8e62634bc0 1 bdev(0x559e57913800 td/osd-mapper/0/block) open path td/osd-mapper/0/block 2026-03-08T22:45:47.101 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.088+0000 7f8e62634bc0 1 bdev(0x559e57913800 
td/osd-mapper/0/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:47.101 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.088+0000 7f8e62634bc0 1 bdev(0x559e57913800 td/osd-mapper/0/block) close 2026-03-08T22:45:47.386 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bdev(0x559e57913800 td/osd-mapper/0/block) open path td/osd-mapper/0/block 2026-03-08T22:45:47.386 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bdev(0x559e57913800 td/osd-mapper/0/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:47.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-08T22:45:47.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bdev(0x559e57913c00 td/osd-mapper/0/block) open path td/osd-mapper/0/block 2026-03-08T22:45:47.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bdev(0x559e57913c00 td/osd-mapper/0/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:47.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/0/block size 100 GiB 2026-03-08T22:45:47.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs mount 2026-03-08T22:45:47.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:47.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs 
mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluefs mount shared_bdev_used = 0 2026-03-08T22:45:47.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.376+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:47.415 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:47.415 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _open_super_meta old nid_max 2051 2026-03-08T22:45:47.415 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _open_super_meta old blobid_max 20480 2026-03-08T22:45:47.415 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-08T22:45:47.415 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _open_super_meta min_alloc_size 0x1000 2026-03-08T22:45:47.415 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 freelist init 2026-03-08T22:45:47.415 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 freelist _read_cfg 2026-03-08T22:45:47.416 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _init_alloc loaded 100 GiB in 19 extents, allocator type hybrid, capacity 0x1900000000, block size 0x1000, free 0x18fffc2000, fragmentation 6.86647e-07 2026-03-08T22:45:47.416 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 bluefs umount 2026-03-08T22:45:47.416 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.404+0000 7f8e62634bc0 1 bdev(0x559e57913c00 td/osd-mapper/0/block) close 2026-03-08T22:45:47.666 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bdev(0x559e57913c00 td/osd-mapper/0/block) open path td/osd-mapper/0/block 2026-03-08T22:45:47.666 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bdev(0x559e57913c00 td/osd-mapper/0/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:47.666 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/0/block size 100 GiB 2026-03-08T22:45:47.666 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs mount 2026-03-08T22:45:47.667 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:47.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:47.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluefs mount shared_bdev_used = 27459584 2026-03-08T22:45:47.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.656+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:47.692 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.680+0000 7f8e62634bc0 1 bluestore(td/osd-mapper/0) _open_db opened rocksdb path db options 
compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01.osd_superblock 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000000 0b 05 58 02 00 00 b7 56 c6 8a 4c 87 4c c4 be 7a |..X....V..L.L..z| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000010 64 0e 38 94 f7 03 00 00 00 00 1b 00 00 00 00 00 |d.8.............| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:* 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 fe ff 03 00 00 00 00 00 11 00 |................| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000050 00 00 01 00 00 00 00 00 00 00 1a 00 00 00 69 6e |..............in| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000060 69 74 69 61 6c 20 66 65 61 74 75 72 65 20 73 65 |itial feature se| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000070 74 28 7e 76 2e 31 38 29 02 00 00 00 00 00 00 00 |t(~v.18)........| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000080 0d 00 00 00 70 67 69 6e 66 6f 20 6f 62 6a 65 63 |....pginfo objec| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000090 74 03 00 00 00 00 00 00 00 0e 00 00 00 6f 62 6a |t............obj| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000000a0 65 63 74 20 6c 6f 63 61 74 6f 72 04 00 00 00 00 |ect locator.....| 
2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000000b0 00 00 00 10 00 00 00 6c 61 73 74 5f 65 70 6f 63 |.......last_epoc| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000000c0 68 5f 63 6c 65 61 6e 05 00 00 00 00 00 00 00 0a |h_clean.........| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000000d0 00 00 00 63 61 74 65 67 6f 72 69 65 73 06 00 00 |...categories...| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000000e0 00 00 00 00 00 0b 00 00 00 68 6f 62 6a 65 63 74 |.........hobject| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000000f0 70 6f 6f 6c 07 00 00 00 00 00 00 00 07 00 00 00 |pool............| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000100 62 69 67 69 6e 66 6f 08 00 00 00 00 00 00 00 0b |biginfo.........| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000110 00 00 00 6c 65 76 65 6c 64 62 69 6e 66 6f 09 00 |...leveldbinfo..| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000120 00 00 00 00 00 00 0a 00 00 00 6c 65 76 65 6c 64 |..........leveld| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000130 62 6c 6f 67 0a 00 00 00 00 00 00 00 0a 00 00 00 |blog............| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000140 73 6e 61 70 6d 61 70 70 65 72 0b 00 00 00 00 00 |snapmapper......| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000150 00 00 0f 00 00 00 73 68 61 72 64 65 64 20 6f 62 |......sharded ob| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000160 6a 65 63 74 73 0c 00 00 00 00 00 00 00 11 00 00 |jects...........| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000170 00 74 72 61 6e 73 61 63 74 69 6f 6e 20 68 69 6e |.transaction hin| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000180 74 73 0d 00 00 00 00 00 00 00 0e 00 00 00 70 67 |ts............pg| 2026-03-08T22:45:47.693 
INFO:tasks.workunit.client.0.vm03.stdout:00000190 20 6d 65 74 61 20 6f 62 6a 65 63 74 0e 00 00 00 | meta object....| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000001a0 00 00 00 00 14 00 00 00 65 78 70 6c 69 63 69 74 |........explicit| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000001b0 20 6d 69 73 73 69 6e 67 20 73 65 74 0f 00 00 00 | missing set....| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000001c0 00 00 00 00 10 00 00 00 66 61 73 74 69 6e 66 6f |........fastinfo| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000001d0 20 70 67 20 61 74 74 72 10 00 00 00 00 00 00 00 | pg attr........| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000001e0 16 00 00 00 64 65 6c 65 74 65 73 20 69 6e 20 6d |....deletes in m| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:000001f0 69 73 73 69 6e 67 20 73 65 74 11 00 00 00 00 00 |issing set......| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000200 00 00 1c 00 00 00 6e 65 77 20 73 6e 61 70 6d 61 |......new snapma| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000210 70 70 65 72 20 6b 65 79 20 73 74 72 75 63 74 75 |pper key structu| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000220 72 65 1b 00 00 00 05 00 00 00 8e 6d 8e 4f cf bc |re.........m.O..| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000230 41 3a a7 f2 e7 55 11 37 9b 1b 00 00 00 00 00 00 |A:...U.7........| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000240 00 00 1b 00 00 00 f9 fb ad 69 be d2 2c 23 01 00 |.........i..,#..| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000250 00 00 01 00 00 00 01 00 00 00 1b 00 00 00 |..............| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:0000025e 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.693 
INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01~ 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 02 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 01 00 00 00 00 00 00 00 02 |................| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 39 00 00 00 04 03 27 00 00 00 00 00 00 00 |..9.....'.......| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 03 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 03 00 00 00 00 00 00 00 |...............| 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:0000003f 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.693 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 05 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 04 00 00 00 00 00 00 00 05 |................| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FA570D58.2.objxxx.. 
2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 01 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 02 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 03 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 03 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 04 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.FA570D58.5.objxxx.. 
2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 05 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02~ 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c1%a3%fcn%00%00%00%00%00%00%04%03~ 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.680+0000 7f8e62634bc0 1 bluefs umount 2026-03-08T22:45:47.694 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.680+0000 7f8e62634bc0 1 bdev(0x559e57913c00 td/osd-mapper/0/block) close 2026-03-08T22:45:47.946 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.936+0000 7f8e62634bc0 1 freelist shutdown 2026-03-08T22:45:47.946 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:47.936+0000 7f8e62634bc0 1 bdev(0x559e57913800 td/osd-mapper/0/block) close 2026-03-08T22:45:48.170 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/0 dump p 2026-03-08T22:45:48.170 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: awk -e '{print $2;}' 2026-03-08T22:45:48.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: grep -a -e 'SNA_[0-9]_0000000000000003_000000000000000' 2026-03-08T22:45:49.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: KY=%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 2026-03-08T22:45:49.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:45:49.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: echo 'SNA key: %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx..' 2026-03-08T22:45:49.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: cat -v 2026-03-08T22:45:49.247 INFO:tasks.workunit.client.0.vm03.stdout:SNA key: %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:49.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:117: TEST_truncated_sna_record: mktemp -p /tmp --suffix=_the_val 2026-03-08T22:45:49.248 INFO:tasks.workunit.client.0.vm03.stdout:Value dumped in: /tmp/tmp.om6w8iM2B9_the_val 2026-03-08T22:45:49.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:117: TEST_truncated_sna_record: tmp_fn1=/tmp/tmp.om6w8iM2B9_the_val 2026-03-08T22:45:49.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:118: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:45:49.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:118: TEST_truncated_sna_record: echo 'Value dumped in: /tmp/tmp.om6w8iM2B9_the_val' 2026-03-08T22:45:49.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:119: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/0 get p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. out /tmp/tmp.om6w8iM2B9_the_val 2026-03-08T22:45:49.846 INFO:tasks.workunit.client.0.vm03.stdout:(p, %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx..) 
2026-03-08T22:45:50.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:120: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:45:50.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:120: TEST_truncated_sna_record: od -xc /tmp/tmp.om6w8iM2B9_the_val 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout:0000000 0101 0035 0000 0003 0000 0000 0000 0304 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout: 001 001 5 \0 \0 \0 003 \0 \0 \0 \0 \0 \0 \0 004 003 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout:0000020 0027 0000 0000 0000 0006 0000 626f 786a 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout: ' \0 \0 \0 \0 \0 \0 \0 006 \0 \0 \0 o b j x 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout:0000040 7878 0003 0000 0000 0000 75af 85d0 0000 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout: x x 003 \0 \0 \0 \0 \0 \0 \0 257 u 320 205 \0 \0 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout:0000060 0000 0100 0000 0000 0000 0000 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout: \0 \0 \0 001 \0 \0 \0 \0 \0 \0 \0 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stdout:0000073 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:122: TEST_truncated_sna_record: NKY=%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_000000 2026-03-08T22:45:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:123: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/0 rm p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:51.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:124: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/0 set p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_000000 in /tmp/tmp.om6w8iM2B9_the_val 2026-03-08T22:45:52.494 INFO:tasks.workunit.client.0.vm03.stdout:corrupting the SnapMapper DB of osd.1 (db: td/osd-mapper/1) 2026-03-08T22:45:52.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:126: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:45:52.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:106: TEST_truncated_sna_record: for sdn in $(seq 0 $(expr $osdn - 1)) 2026-03-08T22:45:52.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:108: TEST_truncated_sna_record: kvdir=td/osd-mapper/1 2026-03-08T22:45:52.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:109: TEST_truncated_sna_record: echo 'corrupting the SnapMapper DB of osd.1 (db: td/osd-mapper/1)' 2026-03-08T22:45:52.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:110: TEST_truncated_sna_record: (( extr_dbg >= 3 )) 2026-03-08T22:45:52.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:110: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/1 dump p 2026-03-08T22:45:52.509 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.496+0000 7fdf313f6bc0 1 bdev(0x559483ddf800 td/osd-mapper/1/block) open path td/osd-mapper/1/block 2026-03-08T22:45:52.509 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.496+0000 7fdf313f6bc0 1 
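The xtrace above (osd-mapper.sh lines ~106-126) shows `TEST_truncated_sna_record` deliberately corrupting each OSD's SnapMapper DB: it locates a `SNA_` key under the `p` prefix, saves its value, removes the well-formed key, and re-inserts the value under a truncated key. The sketch below reconstructs that sequence from the trace for osd.0; the `td/osd-mapper/0` path, the `ceph-kvstore-tool` invocations, and the truncated-suffix literal are taken verbatim from the log, but treat the exact cut point as illustrative of "truncate the key mid-way", not as the script's canonical implementation.

```shell
#!/usr/bin/env bash
# Reconstructed from the osd-mapper.sh trace above: corrupt one SnapMapper
# record by replacing its key with a truncated copy.
kvdir=td/osd-mapper/0   # per-OSD kvstore path, as seen in the trace

# 1. dump the 'p' prefix and pick out the target SNA_ mapping key
KY=$(ceph-kvstore-tool bluestore-kv "$kvdir" dump p \
       | awk '{print $2;}' \
       | grep -a -e 'SNA_[0-9]_0000000000000003_000000000000000')

# 2. save the record's value to a temp file
tmp_fn1=$(mktemp -p /tmp --suffix=_the_val)
ceph-kvstore-tool bluestore-kv "$kvdir" get p "$KY" out "$tmp_fn1"

# 3. build the truncated key by stripping the tail of the snap id plus the
#    object suffix (literal taken from the KY/NKY pair in the trace)
NKY="${KY%0000000001.FA570D58.3.objxxx..}"

# 4. drop the good key and re-insert the same value under the truncated key
ceph-kvstore-tool bluestore-kv "$kvdir" rm  p "$KY"
ceph-kvstore-tool bluestore-kv "$kvdir" set p "$NKY" in "$tmp_fn1"
```

The resulting on-disk record is what the later scrub pass is expected to detect as a truncated `SNA_` entry; the same loop repeats for `td/osd-mapper/1` immediately below in the trace.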
bdev(0x559483ddf800 td/osd-mapper/1/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:52.509 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.496+0000 7fdf313f6bc0 1 bdev(0x559483ddf800 td/osd-mapper/1/block) close 2026-03-08T22:45:52.790 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bdev(0x559483ddf800 td/osd-mapper/1/block) open path td/osd-mapper/1/block 2026-03-08T22:45:52.790 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bdev(0x559483ddf800 td/osd-mapper/1/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:52.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-08T22:45:52.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bdev(0x559483ddfc00 td/osd-mapper/1/block) open path td/osd-mapper/1/block 2026-03-08T22:45:52.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bdev(0x559483ddfc00 td/osd-mapper/1/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:52.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/1/block size 100 GiB 2026-03-08T22:45:52.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluefs mount 2026-03-08T22:45:52.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:52.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 
7fdf313f6bc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:52.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:52.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:52.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:52.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:52.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluefs mount shared_bdev_used = 0 2026-03-08T22:45:52.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.780+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _open_super_meta old nid_max 2051 
2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _open_super_meta old blobid_max 20480 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _open_super_meta min_alloc_size 0x1000 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 freelist init 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 freelist _read_cfg 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _init_alloc loaded 100 GiB in 15 extents, allocator type hybrid, capacity 0x1900000000, block size 0x1000, free 0x18fffc2000, fragmentation 5.34059e-07 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 bluefs umount 2026-03-08T22:45:52.821 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:52.808+0000 7fdf313f6bc0 1 bdev(0x559483ddfc00 td/osd-mapper/1/block) close 2026-03-08T22:45:53.070 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bdev(0x559483ddfc00 td/osd-mapper/1/block) open path td/osd-mapper/1/block 2026-03-08T22:45:53.071 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bdev(0x559483ddfc00 td/osd-mapper/1/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:53.071 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/1/block size 100 GiB 
2026-03-08T22:45:53.071 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs mount 2026-03-08T22:45:53.071 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:53.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:53.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:53.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:53.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:53.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:53.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluefs mount shared_bdev_used = 27459584 2026-03-08T22:45:53.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.060+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:53.094 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.080+0000 7fdf313f6bc0 1 bluestore(td/osd-mapper/1) _open_db opened rocksdb path db options 
compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:53.094 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01.osd_superblock 2026-03-08T22:45:53.094 INFO:tasks.workunit.client.0.vm03.stdout:00000000 0b 05 58 02 00 00 b7 56 c6 8a 4c 87 4c c4 be 7a |..X....V..L.L..z| 2026-03-08T22:45:53.094 INFO:tasks.workunit.client.0.vm03.stdout:00000010 64 0e 38 94 f7 03 01 00 00 00 1b 00 00 00 00 00 |d.8.............| 2026-03-08T22:45:53.094 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................| 2026-03-08T22:45:53.094 INFO:tasks.workunit.client.0.vm03.stdout:* 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 fe ff 03 00 00 00 00 00 11 00 |................| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000050 00 00 01 00 00 00 00 00 00 00 1a 00 00 00 69 6e |..............in| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000060 69 74 69 61 6c 20 66 65 61 74 75 72 65 20 73 65 |itial feature se| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000070 74 28 7e 76 2e 31 38 29 02 00 00 00 00 00 00 00 |t(~v.18)........| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000080 0d 00 00 00 70 67 69 6e 66 6f 20 6f 62 6a 65 63 |....pginfo objec| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000090 74 03 00 00 00 00 00 00 00 0e 00 00 00 6f 62 6a |t............obj| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000000a0 65 63 74 20 6c 6f 63 61 74 6f 72 04 00 00 00 00 |ect locator.....| 
2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000000b0 00 00 00 10 00 00 00 6c 61 73 74 5f 65 70 6f 63 |.......last_epoc| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000000c0 68 5f 63 6c 65 61 6e 05 00 00 00 00 00 00 00 0a |h_clean.........| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000000d0 00 00 00 63 61 74 65 67 6f 72 69 65 73 06 00 00 |...categories...| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000000e0 00 00 00 00 00 0b 00 00 00 68 6f 62 6a 65 63 74 |.........hobject| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000000f0 70 6f 6f 6c 07 00 00 00 00 00 00 00 07 00 00 00 |pool............| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000100 62 69 67 69 6e 66 6f 08 00 00 00 00 00 00 00 0b |biginfo.........| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000110 00 00 00 6c 65 76 65 6c 64 62 69 6e 66 6f 09 00 |...leveldbinfo..| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000120 00 00 00 00 00 00 0a 00 00 00 6c 65 76 65 6c 64 |..........leveld| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000130 62 6c 6f 67 0a 00 00 00 00 00 00 00 0a 00 00 00 |blog............| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000140 73 6e 61 70 6d 61 70 70 65 72 0b 00 00 00 00 00 |snapmapper......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000150 00 00 0f 00 00 00 73 68 61 72 64 65 64 20 6f 62 |......sharded ob| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000160 6a 65 63 74 73 0c 00 00 00 00 00 00 00 11 00 00 |jects...........| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000170 00 74 72 61 6e 73 61 63 74 69 6f 6e 20 68 69 6e |.transaction hin| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000180 74 73 0d 00 00 00 00 00 00 00 0e 00 00 00 70 67 |ts............pg| 2026-03-08T22:45:53.095 
INFO:tasks.workunit.client.0.vm03.stdout:00000190 20 6d 65 74 61 20 6f 62 6a 65 63 74 0e 00 00 00 | meta object....| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000001a0 00 00 00 00 14 00 00 00 65 78 70 6c 69 63 69 74 |........explicit| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000001b0 20 6d 69 73 73 69 6e 67 20 73 65 74 0f 00 00 00 | missing set....| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000001c0 00 00 00 00 10 00 00 00 66 61 73 74 69 6e 66 6f |........fastinfo| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000001d0 20 70 67 20 61 74 74 72 10 00 00 00 00 00 00 00 | pg attr........| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000001e0 16 00 00 00 64 65 6c 65 74 65 73 20 69 6e 20 6d |....deletes in m| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:000001f0 69 73 73 69 6e 67 20 73 65 74 11 00 00 00 00 00 |issing set......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000200 00 00 1c 00 00 00 6e 65 77 20 73 6e 61 70 6d 61 |......new snapma| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000210 70 70 65 72 20 6b 65 79 20 73 74 72 75 63 74 75 |pper key structu| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000220 72 65 1b 00 00 00 0a 00 00 00 ef 20 a3 59 e0 37 |re......... 
.Y.7| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000230 45 3d 88 99 a3 60 8c e0 66 25 00 00 00 00 00 00 |E=...`..f%......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000240 00 00 1b 00 00 00 00 fc ad 69 89 28 e0 2f 01 00 |.........i.(./..| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000250 00 00 01 00 00 00 01 00 00 00 1b 00 00 00 |..............| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:0000025e 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01~ 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 02 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 01 00 00 00 00 00 00 00 02 |................| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 39 00 00 00 04 03 27 00 00 00 00 00 00 00 |..9.....'.......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 03 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 03 00 00 00 00 00 00 00 |...............| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:0000003f 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 05 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 04 00 00 00 00 00 00 00 05 |................| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FA570D58.2.objxxx.. 
2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 01 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:53.095 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 02 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 03 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 03 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 04 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.FA570D58.5.objxxx.. 
2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 05 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02~ 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c1%a3%fcn%00%00%00%00%00%00%04%03~ 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.084+0000 7fdf313f6bc0 1 bluefs umount 2026-03-08T22:45:53.096 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.084+0000 7fdf313f6bc0 1 bdev(0x559483ddfc00 td/osd-mapper/1/block) close 2026-03-08T22:45:53.350 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.340+0000 7fdf313f6bc0 1 freelist shutdown 2026-03-08T22:45:53.350 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:53.340+0000 7fdf313f6bc0 1 bdev(0x559483ddf800 td/osd-mapper/1/block) close 2026-03-08T22:45:53.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/1 dump p 2026-03-08T22:45:53.574 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: awk -e '{print $2;}' 2026-03-08T22:45:53.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: grep -a -e 'SNA_[0-9]_0000000000000003_000000000000000' 2026-03-08T22:45:54.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: KY=%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 2026-03-08T22:45:54.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:45:54.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: echo 'SNA key: %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx..' 2026-03-08T22:45:54.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: cat -v 2026-03-08T22:45:54.656 INFO:tasks.workunit.client.0.vm03.stdout:SNA key: %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:54.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:117: TEST_truncated_sna_record: mktemp -p /tmp --suffix=_the_val 2026-03-08T22:45:54.657 INFO:tasks.workunit.client.0.vm03.stdout:Value dumped in: /tmp/tmp.1aV8D2UoRs_the_val 2026-03-08T22:45:54.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:117: TEST_truncated_sna_record: tmp_fn1=/tmp/tmp.1aV8D2UoRs_the_val 2026-03-08T22:45:54.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:118: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:45:54.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:118: TEST_truncated_sna_record: echo 'Value dumped in: /tmp/tmp.1aV8D2UoRs_the_val' 2026-03-08T22:45:54.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:119: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/1 get p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. out /tmp/tmp.1aV8D2UoRs_the_val 2026-03-08T22:45:55.258 INFO:tasks.workunit.client.0.vm03.stdout:(p, %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx..) 
2026-03-08T22:45:55.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:120: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:120: TEST_truncated_sna_record: od -xc /tmp/tmp.1aV8D2UoRs_the_val 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout:0000000 0101 0035 0000 0003 0000 0000 0000 0304 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout: 001 001 5 \0 \0 \0 003 \0 \0 \0 \0 \0 \0 \0 004 003 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout:0000020 0027 0000 0000 0000 0006 0000 626f 786a 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout: ' \0 \0 \0 \0 \0 \0 \0 006 \0 \0 \0 o b j x 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout:0000040 7878 0003 0000 0000 0000 75af 85d0 0000 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout: x x 003 \0 \0 \0 \0 \0 \0 \0 257 u 320 205 \0 \0 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout:0000060 0000 0100 0000 0000 0000 0000 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout: \0 \0 \0 001 \0 \0 \0 \0 \0 \0 \0 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stdout:0000073 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:122: TEST_truncated_sna_record: NKY=%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_000000 2026-03-08T22:45:55.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:123: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/1 rm p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:56.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:124: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/1 set p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_000000 in /tmp/tmp.1aV8D2UoRs_the_val 2026-03-08T22:45:58.165 INFO:tasks.workunit.client.0.vm03.stdout:corrupting the SnapMapper DB of osd.2 (db: td/osd-mapper/2) 2026-03-08T22:45:58.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:126: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:45:58.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:106: TEST_truncated_sna_record: for sdn in $(seq 0 $(expr $osdn - 1)) 2026-03-08T22:45:58.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:108: TEST_truncated_sna_record: kvdir=td/osd-mapper/2 2026-03-08T22:45:58.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:109: TEST_truncated_sna_record: echo 'corrupting the SnapMapper DB of osd.2 (db: td/osd-mapper/2)' 2026-03-08T22:45:58.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:110: TEST_truncated_sna_record: (( extr_dbg >= 3 )) 2026-03-08T22:45:58.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:110: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/2 dump p 2026-03-08T22:45:58.179 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.168+0000 7f90bc7fdbc0 1 bdev(0x55bce8075800 td/osd-mapper/2/block) open path td/osd-mapper/2/block 2026-03-08T22:45:58.179 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.168+0000 7f90bc7fdbc0 1 
bdev(0x55bce8075800 td/osd-mapper/2/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:58.179 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.168+0000 7f90bc7fdbc0 1 bdev(0x55bce8075800 td/osd-mapper/2/block) close 2026-03-08T22:45:58.454 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.444+0000 7f90bc7fdbc0 1 bdev(0x55bce8075800 td/osd-mapper/2/block) open path td/osd-mapper/2/block 2026-03-08T22:45:58.455 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.444+0000 7f90bc7fdbc0 1 bdev(0x55bce8075800 td/osd-mapper/2/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:58.455 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.444+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06 2026-03-08T22:45:58.455 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.444+0000 7f90bc7fdbc0 1 bdev(0x55bce8075c00 td/osd-mapper/2/block) open path td/osd-mapper/2/block 2026-03-08T22:45:58.455 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.444+0000 7f90bc7fdbc0 1 bdev(0x55bce8075c00 td/osd-mapper/2/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:58.455 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.444+0000 7f90bc7fdbc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/2/block size 100 GiB 2026-03-08T22:45:58.455 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.444+0000 7f90bc7fdbc0 1 bluefs mount 2026-03-08T22:45:58.455 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.444+0000 7f90bc7fdbc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:58.458 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.448+0000 
7f90bc7fdbc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.458 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.448+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.458 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.448+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.458 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.448+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.458 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.448+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.458 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.448+0000 7f90bc7fdbc0 1 bluefs mount shared_bdev_used = 0 2026-03-08T22:45:58.458 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.448+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:58.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:58.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _open_super_meta old nid_max 2051 
2026-03-08T22:45:58.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _open_super_meta old blobid_max 20480 2026-03-08T22:45:58.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _open_super_meta ondisk_format 4 compat_ondisk_format 3 2026-03-08T22:45:58.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _open_super_meta min_alloc_size 0x1000 2026-03-08T22:45:58.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 freelist init 2026-03-08T22:45:58.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 freelist _read_cfg 2026-03-08T22:45:58.489 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _init_alloc loaded 100 GiB in 12 extents, allocator type hybrid, capacity 0x1900000000, block size 0x1000, free 0x18fffc2000, fragmentation 4.19618e-07 2026-03-08T22:45:58.489 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 bluefs umount 2026-03-08T22:45:58.489 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.476+0000 7f90bc7fdbc0 1 bdev(0x55bce8075c00 td/osd-mapper/2/block) close 2026-03-08T22:45:58.734 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bdev(0x55bce8075c00 td/osd-mapper/2/block) open path td/osd-mapper/2/block 2026-03-08T22:45:58.734 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bdev(0x55bce8075c00 td/osd-mapper/2/block) open size 107374182400 (0x1900000000, 100 GiB) block_size 4096 (4 KiB) rotational device, discard supported 2026-03-08T22:45:58.734 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs add_block_device bdev 1 path td/osd-mapper/2/block size 100 GiB 
2026-03-08T22:45:58.734 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs mount 2026-03-08T22:45:58.734 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs _init_alloc shared, id 1, capacity 0x1900000000, block size 0x10000 2026-03-08T22:45:58.737 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.737 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.737 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0> 2026-03-08T22:45:58.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluefs mount shared_bdev_used = 27459584 2026-03-08T22:45:58.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.724+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _prepare_db_environment set db_paths to db,102005473280 db.slow,102005473280 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.748+0000 7f90bc7fdbc0 1 bluestore(td/osd-mapper/2) _open_db opened rocksdb path db options 
compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01.osd_superblock 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000000 0b 05 58 02 00 00 b7 56 c6 8a 4c 87 4c c4 be 7a |..X....V..L.L..z| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000010 64 0e 38 94 f7 03 02 00 00 00 1b 00 00 00 00 00 |d.8.............| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 |................| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:* 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 fe ff 03 00 00 00 00 00 11 00 |................| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000050 00 00 01 00 00 00 00 00 00 00 1a 00 00 00 69 6e |..............in| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000060 69 74 69 61 6c 20 66 65 61 74 75 72 65 20 73 65 |itial feature se| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000070 74 28 7e 76 2e 31 38 29 02 00 00 00 00 00 00 00 |t(~v.18)........| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000080 0d 00 00 00 70 67 69 6e 66 6f 20 6f 62 6a 65 63 |....pginfo objec| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000090 74 03 00 00 00 00 00 00 00 0e 00 00 00 6f 62 6a |t............obj| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:000000a0 65 63 74 20 6c 6f 63 61 74 6f 72 04 00 00 00 00 |ect locator.....| 
2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:000000b0 00 00 00 10 00 00 00 6c 61 73 74 5f 65 70 6f 63 |.......last_epoc| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:000000c0 68 5f 63 6c 65 61 6e 05 00 00 00 00 00 00 00 0a |h_clean.........| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:000000d0 00 00 00 63 61 74 65 67 6f 72 69 65 73 06 00 00 |...categories...| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:000000e0 00 00 00 00 00 0b 00 00 00 68 6f 62 6a 65 63 74 |.........hobject| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:000000f0 70 6f 6f 6c 07 00 00 00 00 00 00 00 07 00 00 00 |pool............| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000100 62 69 67 69 6e 66 6f 08 00 00 00 00 00 00 00 0b |biginfo.........| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000110 00 00 00 6c 65 76 65 6c 64 62 69 6e 66 6f 09 00 |...leveldbinfo..| 2026-03-08T22:45:58.761 INFO:tasks.workunit.client.0.vm03.stdout:00000120 00 00 00 00 00 00 0a 00 00 00 6c 65 76 65 6c 64 |..........leveld| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000130 62 6c 6f 67 0a 00 00 00 00 00 00 00 0a 00 00 00 |blog............| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000140 73 6e 61 70 6d 61 70 70 65 72 0b 00 00 00 00 00 |snapmapper......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000150 00 00 0f 00 00 00 73 68 61 72 64 65 64 20 6f 62 |......sharded ob| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000160 6a 65 63 74 73 0c 00 00 00 00 00 00 00 11 00 00 |jects...........| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000170 00 74 72 61 6e 73 61 63 74 69 6f 6e 20 68 69 6e |.transaction hin| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000180 74 73 0d 00 00 00 00 00 00 00 0e 00 00 00 70 67 |ts............pg| 2026-03-08T22:45:58.762 
INFO:tasks.workunit.client.0.vm03.stdout:00000190 20 6d 65 74 61 20 6f 62 6a 65 63 74 0e 00 00 00 | meta object....| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:000001a0 00 00 00 00 14 00 00 00 65 78 70 6c 69 63 69 74 |........explicit| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:000001b0 20 6d 69 73 73 69 6e 67 20 73 65 74 0f 00 00 00 | missing set....| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:000001c0 00 00 00 00 10 00 00 00 66 61 73 74 69 6e 66 6f |........fastinfo| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:000001d0 20 70 67 20 61 74 74 72 10 00 00 00 00 00 00 00 | pg attr........| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:000001e0 16 00 00 00 64 65 6c 65 74 65 73 20 69 6e 20 6d |....deletes in m| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:000001f0 69 73 73 69 6e 67 20 73 65 74 11 00 00 00 00 00 |issing set......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000200 00 00 1c 00 00 00 6e 65 77 20 73 6e 61 70 6d 61 |......new snapma| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000210 70 70 65 72 20 6b 65 79 20 73 74 72 75 63 74 75 |pper key structu| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000220 72 65 1b 00 00 00 0f 00 00 00 fa aa d7 3c 60 57 |re...........<`W| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000230 40 3a a6 8b 13 ff 60 da ef 71 00 00 00 00 00 00 |@:....`..q......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000240 00 00 1b 00 00 00 07 fc ad 69 4f 25 e9 04 01 00 |.........iO%....| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000250 00 00 01 00 00 00 01 00 00 00 1b 00 00 00 |..............| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:0000025e 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.762 
INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%7b%3fC%c4%00%00%00%00%00%00%00%01~ 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 02 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 01 00 00 00 00 00 00 00 02 |................| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 39 00 00 00 04 03 27 00 00 00 00 00 00 00 |..9.....'.......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 03 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 03 00 00 00 00 00 00 00 |...............| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:0000003f 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.OBJ_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 41 00 00 00 04 03 27 00 00 00 00 00 00 00 |..A.....'.......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000010 06 00 00 00 6f 62 6a 78 78 78 05 00 00 00 00 00 |....objxxx......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000020 00 00 af 75 d0 85 00 00 00 00 00 01 00 00 00 00 |...u............| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 02 00 00 00 04 00 00 00 00 00 00 00 05 |................| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000040 00 00 00 00 00 00 00 |.......| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000047 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FA570D58.2.objxxx.. 
2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 01 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:58.762 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 02 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 02 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 03 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 03 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 04 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.FA570D58.5.objxxx.. 
2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000000 01 01 35 00 00 00 05 00 00 00 00 00 00 00 04 03 |..5.............| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000010 27 00 00 00 00 00 00 00 06 00 00 00 6f 62 6a 78 |'...........objx| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000020 78 78 05 00 00 00 00 00 00 00 af 75 d0 85 00 00 |xx.........u....| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:00000030 00 00 00 01 00 00 00 00 00 00 00 |...........| 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:0000003b 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02~ 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c1%a3%fcn%00%00%00%00%00%00%04%03~ 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:45:58.763 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.748+0000 7f90bc7fdbc0 1 bluefs umount 2026-03-08T22:45:58.764 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:58.748+0000 7f90bc7fdbc0 1 bdev(0x55bce8075c00 td/osd-mapper/2/block) close 2026-03-08T22:45:59.014 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:59.004+0000 7f90bc7fdbc0 1 freelist shutdown 2026-03-08T22:45:59.014 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:45:59.004+0000 7f90bc7fdbc0 1 bdev(0x55bce8075800 td/osd-mapper/2/block) close 2026-03-08T22:45:59.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/2 dump p 2026-03-08T22:45:59.239 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: awk -e '{print $2;}' 2026-03-08T22:45:59.240 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: grep -a -e 'SNA_[0-9]_0000000000000003_000000000000000' 2026-03-08T22:46:00.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:114: TEST_truncated_sna_record: KY=%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 2026-03-08T22:46:00.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:46:00.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: echo 'SNA key: %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx..' 2026-03-08T22:46:00.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:115: TEST_truncated_sna_record: cat -v 2026-03-08T22:46:00.319 INFO:tasks.workunit.client.0.vm03.stdout:SNA key: %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:46:00.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:117: TEST_truncated_sna_record: mktemp -p /tmp --suffix=_the_val 2026-03-08T22:46:00.320 INFO:tasks.workunit.client.0.vm03.stdout:Value dumped in: /tmp/tmp.IY6cY4DMkJ_the_val 2026-03-08T22:46:00.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:117: TEST_truncated_sna_record: tmp_fn1=/tmp/tmp.IY6cY4DMkJ_the_val 2026-03-08T22:46:00.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:118: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:46:00.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:118: TEST_truncated_sna_record: echo 'Value dumped in: /tmp/tmp.IY6cY4DMkJ_the_val' 2026-03-08T22:46:00.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:119: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/2 get p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. out /tmp/tmp.IY6cY4DMkJ_the_val 2026-03-08T22:46:00.919 INFO:tasks.workunit.client.0.vm03.stdout:(p, %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx..) 
2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:120: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:120: TEST_truncated_sna_record: od -xc /tmp/tmp.IY6cY4DMkJ_the_val 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout:0000000 0101 0035 0000 0003 0000 0000 0000 0304 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout: 001 001 5 \0 \0 \0 003 \0 \0 \0 \0 \0 \0 \0 004 003 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout:0000020 0027 0000 0000 0000 0006 0000 626f 786a 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout: ' \0 \0 \0 \0 \0 \0 \0 006 \0 \0 \0 o b j x 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout:0000040 7878 0003 0000 0000 0000 75af 85d0 0000 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout: x x 003 \0 \0 \0 \0 \0 \0 \0 257 u 320 205 \0 \0 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout:0000060 0000 0100 0000 0000 0000 0000 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout: \0 \0 \0 001 \0 \0 \0 \0 \0 \0 \0 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stdout:0000073 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:122: TEST_truncated_sna_record: NKY=%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_000000 2026-03-08T22:46:01.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:123: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/2 rm p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 
2026-03-08T22:46:02.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:124: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/2 set p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_000000 in /tmp/tmp.IY6cY4DMkJ_the_val 2026-03-08T22:46:03.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:126: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:46:03.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:129: TEST_truncated_sna_record: orig_osd_args=' --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 ' 2026-03-08T22:46:03.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:130: TEST_truncated_sna_record: echo --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 2026-03-08T22:46:03.810 INFO:tasks.workunit.client.0.vm03.stdout:Copied OSD args: / --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1/ /--osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1/ 2026-03-08T22:46:03.810 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:130: TEST_truncated_sna_record: orig_osd_args=' --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T22:46:03.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:131: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:46:03.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:131: TEST_truncated_sna_record: echo 'Copied OSD args: / --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1/ /--osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1/' 2026-03-08T22:46:03.810 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:132: TEST_truncated_sna_record: expr 3 - 1 2026-03-08T22:46:03.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:132: TEST_truncated_sna_record: seq 0 2 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:132: TEST_truncated_sna_record: for sdn in $(seq 0 $(expr $osdn - 1)) 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:134: TEST_truncated_sna_record: 
CEPH_ARGS='--fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:134: TEST_truncated_sna_record: activate_osd td/osd-mapper 0 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-mapper 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-mapper/0 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: 
activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-mapper/0' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-mapper/0/journal' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-mapper' 2026-03-08T22:46:03.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:03.813 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19288 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19288/$cluster-$name.asok' 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19288/$cluster-$name.asok' 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-mapper/$name.log' 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-mapper/$name.pid' 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:46:03.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:46:03.814 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:46:03.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:46:03.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:46:03.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:46:03.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-mapper/0 2026-03-08T22:46:03.814 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:46:03.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:46:03.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-mapper/0 --osd-journal=td/osd-mapper/0/journal --chdir= --run-dir=td/osd-mapper '--admin-socket=/tmp/ceph-asok.19288/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-mapper/$name.log' '--pid-file=td/osd-mapper/$name.pid' 
--osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:46:03.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-mapper/0/whoami 2026-03-08T22:46:03.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:46:03.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:46:03.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:46:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:46:03.835 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:03.824+0000 7f343c04e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:03.843 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:03.832+0000 7f343c04e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:03.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:03.832+0000 7f343c04e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:04.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:04.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:04.807 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:04.796+0000 7f343c04e8c0 -1 Falling back to public interface 2026-03-08T22:46:05.194 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:46:05.194 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:05.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:05.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:46:05.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:05.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:05.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:05.789 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:05.776+0000 7f343c04e8c0 -1 osd.0 27 log_to_monitors true 2026-03-08T22:46:06.365 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:46:06.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:06.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:06.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:46:06.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:06.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:06.555 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:06.746 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:06.732+0000 7f3432ffe640 -1 osd.0 27 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:46:07.556 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:46:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:46:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:07.732 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 30 up_thru 30 down_at 28 last_clean_interval [5,27) [v2:127.0.0.1:6802/2869127796,v1:127.0.0.1:6803/2869127796] [v2:127.0.0.1:6804/2869127796,v1:127.0.0.1:6805/2869127796] exists,up 8e6d8e4f-cfbc-413a-a7f2-e75511379b1b 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:46:07.733 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:132: TEST_truncated_sna_record: for sdn in $(seq 0 $(expr $osdn - 1)) 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:134: TEST_truncated_sna_record: CEPH_ARGS='--fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:134: TEST_truncated_sna_record: activate_osd td/osd-mapper 1 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-mapper 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-mapper/1 2026-03-08T22:46:07.733 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-mapper/1' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-mapper/1/journal' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-mapper' 2026-03-08T22:46:07.733 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:07.733 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19288 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19288/$cluster-$name.asok' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19288/$cluster-$name.asok' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-mapper/$name.log' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-mapper/$name.pid' 
2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:46:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-mapper/1 2026-03-08T22:46:07.735 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T22:46:07.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:46:07.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off 
--osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-mapper/1 --osd-journal=td/osd-mapper/1/journal --chdir= --run-dir=td/osd-mapper '--admin-socket=/tmp/ceph-asok.19288/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-mapper/$name.log' '--pid-file=td/osd-mapper/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:46:07.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-mapper/1/whoami 2026-03-08T22:46:07.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:46:07.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:46:07.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:46:07.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:46:07.761 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:07.748+0000 7f9252af88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:07.764 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:07.752+0000 7f9252af88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:07.766 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:07.752+0000 7f9252af88c0 -1 WARNING: all dangerous and experimental features are enabled. 
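The nested `get_asok_path` / `get_asok_dir` calls traced above (ceph-helpers.sh:108-120) can be reconstructed roughly as follows. This is a sketch inferred from the xtrace, not the exact helper source; `CEPH_ASOK_DIR` is an assumed override variable implied by the `'[' -n '' ']'` tests, and with no override and no name the path keeps the literal `$cluster-$name` placeholders, exactly as seen in the `--admin-socket` argument in the log.

```shell
# Sketch of get_asok_dir/get_asok_path as implied by the xtrace above.
# CEPH_ASOK_DIR is the (assumed) override checked by the '[ -n ... ]' tests.
get_asok_dir() {
    if [ -n "$CEPH_ASOK_DIR" ]; then
        echo "$CEPH_ASOK_DIR"
    else
        echo "/tmp/ceph-asok.$$"   # per-process temp dir, e.g. /tmp/ceph-asok.19288
    fi
}

get_asok_path() {
    local name=$1
    if [ -n "$name" ]; then
        echo "$(get_asok_dir)/ceph-$name.asok"
    else
        # No name given: keep the $cluster-$name placeholders literal so the
        # daemon expands them itself, as in the log's --admin-socket argument.
        echo "$(get_asok_dir)"'/$cluster-$name.asok'
    fi
}
```

Leaving the placeholders unexpanded lets one `--admin-socket` value serve every daemon started with the same `ceph_args` string.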
2026-03-08T22:46:07.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:46:07.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:46:07.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:46:07.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:46:07.914 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:46:07.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:46:07.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:07.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:46:07.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:07.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:08.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:09.094 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:46:09.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:09.094 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:09.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:46:09.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:09.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:09.215 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:09.204+0000 7f9252af88c0 -1 Falling back to public interface 2026-03-08T22:46:09.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:10.191 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:10.180+0000 7f9252af88c0 -1 osd.1 27 log_to_monitors true 2026-03-08T22:46:10.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:10.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:10.273 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:46:10.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:46:10.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:10.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:10.462 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:11.464 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:46:11.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:11.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:11.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:46:11.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:11.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 33 up_thru 33 down_at 28 last_clean_interval [10,27) [v2:127.0.0.1:6810/1162272461,v1:127.0.0.1:6811/1162272461] [v2:127.0.0.1:6812/1162272461,v1:127.0.0.1:6813/1162272461] exists,up ef20a359-e037-453d-8899-a3608ce06625 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:132: TEST_truncated_sna_record: for sdn in $(seq 0 
$(expr $osdn - 1)) 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:134: TEST_truncated_sna_record: CEPH_ARGS='--fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:134: TEST_truncated_sna_record: activate_osd td/osd-mapper 2 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-mapper 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-mapper/2 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off 
--osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-mapper/2' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-mapper/2/journal' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:46:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-mapper' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:11.632 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19288 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.19288/$cluster-$name.asok' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.19288/$cluster-$name.asok' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-mapper/$name.log' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-mapper/$name.pid' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: 
activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:46:11.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-mapper/2 2026-03-08T22:46:11.633 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T22:46:11.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:46:11.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=b756c68a-4c87-4cc4-be7a-640e3894f703 --auth-supported=none --mon-host=127.0.0.1:7144 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_pool_default_pg_autoscale_mode=off --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-mapper/2 --osd-journal=td/osd-mapper/2/journal --chdir= --run-dir=td/osd-mapper '--admin-socket=/tmp/ceph-asok.19288/$cluster-$name.asok' --debug-osd=20 
'--log-file=td/osd-mapper/$name.log' '--pid-file=td/osd-mapper/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:46:11.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-mapper/2/whoami 2026-03-08T22:46:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T22:46:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:46:11.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:46:11.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:46:11.656 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:11.644+0000 7f88280be8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:11.663 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:11.652+0000 7f88280be8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:11.664 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:11.652+0000 7f88280be8c0 -1 WARNING: all dangerous and experimental features are enabled. 
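The `activate_osd` argument assembly traced twice above (ceph-helpers.sh:846-874) reduces to the following dry-run sketch. `activate_osd_dryrun` is a hypothetical variant introduced here: it echoes the `ceph-osd` invocation instead of launching the daemon, and it keeps only a representative subset of the flags visible in the log. Treat it as an illustration of how `$CEPH_ARGS` and the per-OSD paths combine, not as the real helper.

```shell
# Dry-run sketch of activate_osd, reconstructed from the xtrace above.
# Prints the ceph-osd command line rather than starting the daemon;
# only a subset of the flags seen in the log is reproduced.
activate_osd_dryrun() {
    local dir=$1; shift
    local id=$1; shift
    local osd_data=$dir/$id
    local ceph_args="$CEPH_ARGS"                       # test-wide args from the caller
    ceph_args="$ceph_args --osd-failsafe-full-ratio=.99"
    ceph_args="$ceph_args --osd-journal-size=100"
    ceph_args="$ceph_args --osd-scrub-load-threshold=2000"
    ceph_args="$ceph_args --osd-data=$osd_data"        # per-OSD data dir, e.g. td/osd-mapper/2
    ceph_args="$ceph_args --osd-journal=$osd_data/journal"
    ceph_args="$ceph_args --run-dir=$dir"
    ceph_args="$ceph_args --debug-osd=20"
    ceph_args="$ceph_args --log-file=$dir"'/$name.log' # placeholder expanded by the daemon
    ceph_args="$ceph_args --osd-mclock-profile=high_recovery_ops"
    echo "start osd.$id"
    echo ceph-osd -i "$id" $ceph_args "$@"
}
```

Usage mirrors the traced call: `activate_osd_dryrun td/osd-mapper 2` with `CEPH_ARGS` set as in osd-mapper.sh:134.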
2026-03-08T22:46:11.823 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:46:11.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:12.856 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:12.844+0000 7f88280be8c0 -1 Falling back to public interface 2026-03-08T22:46:12.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:46:12.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:12.994 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:46:12.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:46:12.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:12.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:46:13.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:13.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:13.828+0000 7f88280be8c0 -1 osd.2 27 log_to_monitors true 2026-03-08T22:46:14.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:14.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:14.190 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:46:14.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:46:14.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:14.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:46:14.533 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:15.534 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:46:15.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:15.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:15.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:46:15.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:15.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:46:15.714 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 36 up_thru 36 down_at 28 last_clean_interval [15,27) [v2:127.0.0.1:6818/4196414359,v1:127.0.0.1:6819/4196414359] [v2:127.0.0.1:6820/4196414359,v1:127.0.0.1:6821/4196414359] exists,up faaad73c-6057-403a-a68b-13ff60daef71 2026-03-08T22:46:15.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:46:15.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:46:15.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:46:15.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:136: TEST_truncated_sna_record: sleep 1 
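The `wait_for_osd` polling loop traced repeatedly above (ceph-helpers.sh:978-991) amounts to the sketch below: poll `ceph osd dump` up to 300 times, one second apart, until the OSD reaches the requested state. `CEPH_CMD` is a hypothetical indirection added here so the loop can be exercised without a live cluster; the real helper calls `ceph` directly.

```shell
# Sketch of wait_for_osd reconstructed from the xtrace above. CEPH_CMD is
# a test seam added for this sketch (an assumption); the real helper
# invokes ceph directly.
CEPH_CMD=${CEPH_CMD:-ceph}

wait_for_osd() {
    local state=$1      # e.g. "up"
    local id=$2         # OSD id, e.g. 2
    local status=1
    local i=0
    while [ "$i" -lt 300 ]; do
        echo "$i"       # progress counter, matching the 0/1/2/3 lines on stdout above
        if $CEPH_CMD osd dump | grep "osd.$id $state"; then
            status=0    # "osd.N up" appeared in the dump
            break
        fi
        sleep 1
        i=$((i + 1))
    done
    return $status
}
```

In the run above each OSD took three to four iterations to report up, well inside the 300-second budget.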
2026-03-08T22:46:16.716 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:138: TEST_truncated_sna_record: expr 3 - 1 2026-03-08T22:46:16.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:138: TEST_truncated_sna_record: seq 0 2 2026-03-08T22:46:16.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:138: TEST_truncated_sna_record: for sdn in $(seq 0 $(expr $osdn - 1)) 2026-03-08T22:46:16.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:140: TEST_truncated_sna_record: timeout 60 ceph tell osd.0 version 2026-03-08T22:46:16.790 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:46:16.790 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19.2.3-678-ge911bdeb", 2026-03-08T22:46:16.790 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-08T22:46:16.790 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable" 2026-03-08T22:46:16.790 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:46:16.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:138: TEST_truncated_sna_record: for sdn in $(seq 0 $(expr $osdn - 1)) 2026-03-08T22:46:16.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:140: TEST_truncated_sna_record: timeout 60 ceph tell osd.1 version 2026-03-08T22:46:16.869 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:46:16.869 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19.2.3-678-ge911bdeb", 2026-03-08T22:46:16.869 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-08T22:46:16.869 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable" 2026-03-08T22:46:16.869 INFO:tasks.workunit.client.0.vm03.stdout:} 
2026-03-08T22:46:16.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:138: TEST_truncated_sna_record: for sdn in $(seq 0 $(expr $osdn - 1)) 2026-03-08T22:46:16.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:140: TEST_truncated_sna_record: timeout 60 ceph tell osd.2 version 2026-03-08T22:46:16.947 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:46:16.947 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19.2.3-678-ge911bdeb", 2026-03-08T22:46:16.947 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-08T22:46:16.947 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable" 2026-03-08T22:46:16.947 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:46:16.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:142: TEST_truncated_sna_record: rados --format json-pretty -p test listsnaps objxxx 2026-03-08T22:46:16.974 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "name": "objxxx", 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 5, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "id": 2, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "snapshots": [ 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "id": 1, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap01" 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: 
"id": 2, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap02" 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "size": 6, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "overlaps": [] 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "id": 3, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "snapshots": [ 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "id": 3, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap13" 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "size": 5, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "overlaps": [] 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "id": 5, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "snapshots": [ 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "id": 4, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "name": "snap24" 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.022 INFO:tasks.workunit.client.0.vm03.stdout: "id": 5, 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: 
"name": "snap25" 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: "size": 6, 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: "overlaps": [] 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: "id": "head", 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: "snapshots": [], 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: "size": 6 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:46:17.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:145: TEST_truncated_sna_record: ceph osd unset nodeep-scrub 2026-03-08T22:46:17.177 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is unset 2026-03-08T22:46:17.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:146: TEST_truncated_sna_record: ceph osd unset noscrub 2026-03-08T22:46:17.386 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is unset 2026-03-08T22:46:17.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:149: TEST_truncated_sna_record: ceph --format=json-pretty osd map test objxxx 2026-03-08T22:46:17.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:149: TEST_truncated_sna_record: jq -r '.up[0]' 2026-03-08T22:46:17.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:149: 
TEST_truncated_sna_record: local cur_prim=1 2026-03-08T22:46:17.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:150: TEST_truncated_sna_record: ceph pg dump pgs 2026-03-08T22:46:17.732 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:46:17.733 INFO:tasks.workunit.client.0.vm03.stdout:1.3 4 0 4 0 0 23 0 0 7 0 7 active+undersized+degraded 2026-03-08T22:46:11.378183+0000 25'7 35:52 [1,0] 1 [1,0] 1 25'7 2026-03-08T22:45:34.601567+0000 25'7 2026-03-08T22:45:34.601567+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:34.601567+0000 4 0 2026-03-08T22:46:17.733 INFO:tasks.workunit.client.0.vm03.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:15.435381+0000 0'0 37:37 [0,1,2] 0 [0,1,2] 0 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:46:17.733 INFO:tasks.workunit.client.0.vm03.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 undersized+peered 2026-03-08T22:46:06.898547+0000 0'0 32:11 [0] 0 [0] 0 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:46:17.733 INFO:tasks.workunit.client.0.vm03.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+undersized 2026-03-08T22:46:11.377036+0000 0'0 35:19 [1,0] 1 [1,0] 1 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:46:17.733 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:46:17.733 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap 
statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:46:17.733 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:46:17.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:151: TEST_truncated_sna_record: sleep 2 2026-03-08T22:46:19.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:152: TEST_truncated_sna_record: ceph pg 1.3 deep-scrub 2026-03-08T22:46:19.817 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:46:19.817 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T22:46:19.817 INFO:tasks.workunit.client.0.vm03.stdout: "must": true, 2026-03-08T22:46:19.817 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "0.000000" 2026-03-08T22:46:19.817 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:46:19.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:153: TEST_truncated_sna_record: sleep 5 2026-03-08T22:46:24.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:154: TEST_truncated_sna_record: ceph pg dump pgs 2026-03-08T22:46:24.981 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:46:24.981 INFO:tasks.workunit.client.0.vm03.stdout:1.3 4 0 0 0 0 23 0 0 7 0 7 active+clean 2026-03-08T22:46:20.273418+0000 25'7 39:75 [1,2,0] 1 [1,2,0] 1 25'7 2026-03-08T22:46:20.273377+0000 25'7 
2026-03-08T22:46:20.273377+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:46:20.273377+0000 4 0 2026-03-08T22:46:24.981 INFO:tasks.workunit.client.0.vm03.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:15.435381+0000 0'0 39:41 [0,1,2] 0 [0,1,2] 0 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:46:24.981 INFO:tasks.workunit.client.0.vm03.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:15.434395+0000 0'0 39:26 [2,0,1] 2 [2,0,1] 2 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:46:24.981 INFO:tasks.workunit.client.0.vm03.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:15.437043+0000 0'0 39:32 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:46:24.981 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:46:24.981 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:46:24.981 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:46:24.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:155: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:46:24.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:155: TEST_truncated_sna_record: grep -a ERR td/osd-mapper/osd.1.log 2026-03-08T22:46:24.997 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:46:20.268+0000 7f9237283640 -1 log_channel(cluster) log [ERR] : osd.1 found snap mapper error on pg 1.3 oid 1:f5ae0ba1:::objxxx:3 snaps in mapper: {}, oi: {3} ...repaired 2026-03-08T22:46:24.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:156: TEST_truncated_sna_record: grep -a -q ERR td/osd-mapper/osd.1.log 2026-03-08T22:46:24.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:160: TEST_truncated_sna_record: grep -a ERR td/osd-mapper/osd.1.log 2026-03-08T22:46:24.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:160: TEST_truncated_sna_record: wc -l 2026-03-08T22:46:25.001 INFO:tasks.workunit.client.0.vm03.stdout:prev count: 1 2026-03-08T22:46:25.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:160: TEST_truncated_sna_record: local prev_err_cnt=1 2026-03-08T22:46:25.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:161: TEST_truncated_sna_record: echo 'prev count: 1' 2026-03-08T22:46:25.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:164: TEST_truncated_sna_record: ceph pg 1.3 deep-scrub 2026-03-08T22:46:25.070 
INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:46:25.070 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T22:46:25.070 INFO:tasks.workunit.client.0.vm03.stdout: "must": true, 2026-03-08T22:46:25.070 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "0.000000" 2026-03-08T22:46:25.070 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:46:25.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:165: TEST_truncated_sna_record: sleep 5 2026-03-08T22:46:30.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:166: TEST_truncated_sna_record: ceph pg dump pgs 2026-03-08T22:46:30.237 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:46:30.237 INFO:tasks.workunit.client.0.vm03.stdout:1.3 4 0 0 0 0 23 0 0 7 0 7 active+clean 2026-03-08T22:46:20.273418+0000 25'7 39:76 [1,2,0] 1 [1,2,0] 1 25'7 2026-03-08T22:46:20.273377+0000 25'7 2026-03-08T22:46:20.273377+0000 0 0 queued for deep scrub 4 0 2026-03-08T22:46:30.237 INFO:tasks.workunit.client.0.vm03.stdout:1.2 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:15.435381+0000 0'0 39:41 [0,1,2] 0 [0,1,2] 0 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:46:30.237 INFO:tasks.workunit.client.0.vm03.stdout:1.1 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:15.434395+0000 0'0 39:26 [2,0,1] 2 [2,0,1] 2 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 
2026-03-08T22:46:30.237 INFO:tasks.workunit.client.0.vm03.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:15.437043+0000 0'0 39:32 [1,0,2] 1 [1,0,2] 1 0'0 2026-03-08T22:45:27.459983+0000 0'0 2026-03-08T22:45:27.459983+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:45:27.459983+0000 0 0 2026-03-08T22:46:30.237 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:46:30.237 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:46:30.237 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:46:30.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:167: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:46:30.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:167: TEST_truncated_sna_record: grep -a ERR td/osd-mapper/osd.1.log 2026-03-08T22:46:30.250 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:46:20.268+0000 7f9237283640 -1 log_channel(cluster) log [ERR] : osd.1 found snap mapper error on pg 1.3 oid 1:f5ae0ba1:::objxxx:3 snaps in mapper: {}, oi: {3} ...repaired 2026-03-08T22:46:30.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:168: TEST_truncated_sna_record: grep -a ERR td/osd-mapper/osd.1.log 2026-03-08T22:46:30.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:168: TEST_truncated_sna_record: wc -l 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stdout:current count: 1 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:168: TEST_truncated_sna_record: 
local current_err_cnt=1 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:169: TEST_truncated_sna_record: (( extr_dbg >= 1 )) 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:169: TEST_truncated_sna_record: echo 'current count: 1' 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:170: TEST_truncated_sna_record: (( current_err_cnt == prev_err_cnt )) 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:171: TEST_truncated_sna_record: kill_daemons td/osd-mapper TERM osd 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:46:30.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:46:30.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:46:30.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:172: TEST_truncated_sna_record: kvdir=td/osd-mapper/1 
2026-03-08T22:46:30.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:173: TEST_truncated_sna_record: (( extr_dbg >= 2 )) 2026-03-08T22:46:30.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:173: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/1 dump p 2026-03-08T22:46:30.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:174: TEST_truncated_sna_record: awk -e '{print $2;}' 2026-03-08T22:46:30.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:173: TEST_truncated_sna_record: grep -a -e 'SNA_[0-9]_' 2026-03-08T22:46:31.438 INFO:tasks.workunit.client.0.vm03.stdout:%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:46:31.438 INFO:tasks.workunit.client.0.vm03.stdout:%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.FA570D58.2.objxxx.. 2026-03-08T22:46:31.438 INFO:tasks.workunit.client.0.vm03.stdout:%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_000000 2026-03-08T22:46:31.438 INFO:tasks.workunit.client.0.vm03.stdout:%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.FA570D58.3.objxxx.. 2026-03-08T22:46:31.438 INFO:tasks.workunit.client.0.vm03.stdout:%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.FA570D58.5.objxxx.. 2026-03-08T22:46:31.438 INFO:tasks.workunit.client.0.vm03.stdout:%00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.FA570D58.5.objxxx.. 
2026-03-08T22:46:31.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:176: TEST_truncated_sna_record: ceph-kvstore-tool bluestore-kv td/osd-mapper/1 dump p 2026-03-08T22:46:31.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:176: TEST_truncated_sna_record: awk -e '{print $2;}' 2026-03-08T22:46:31.439 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:176: TEST_truncated_sna_record: grep -a -e 'SNA_[0-9]_000000000000000[0-9]_000000000000000' 2026-03-08T22:46:31.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:176: TEST_truncated_sna_record: wc -l 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:176: TEST_truncated_sna_record: local num_sna_full=5 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:177: TEST_truncated_sna_record: (( num_sna_full == num_sna_b4 )) 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:178: TEST_truncated_sna_record: return 0 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-mapper.sh:24: run: teardown td/osd-mapper 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-mapper 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:46:32.515 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-mapper KILL 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:46:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:46:32.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:46:32.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:46:32.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:46:32.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:46:32.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:46:32.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:46:32.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:46:32.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:32.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:46:32.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:46:32.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:32.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:46:32.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:46:32.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-mapper 2026-03-08T22:46:32.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:46:32.645 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:32.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19288 2026-03-08T22:46:32.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19288 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-mapper 0 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-mapper 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-mapper KILL 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:46:32.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 
2026-03-08T22:46:32.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:46:32.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:46:32.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:46:32.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:46:32.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:46:32.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:46:32.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:46:32.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:46:32.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:46:32.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:46:32.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:32.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:46:32.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:46:32.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:46:32.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:46:32.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T22:46:32.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-mapper 2026-03-08T22:46:32.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:46:32.658 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:32.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.19288 2026-03-08T22:46:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.19288 2026-03-08T22:46:32.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:46:32.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:46:32.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T22:46:32.660 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T22:46:32.660 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T22:46:32.711 INFO:tasks.workunit:Running workunit scrub/osd-recovery-scrub.sh... 
2026-03-08T22:46:32.711 DEBUG:teuthology.orchestra.run.vm03:workunit test scrub/osd-recovery-scrub.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh
2026-03-08T22:46:32.762 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device
2026-03-08T22:46:32.766 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: '
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-recovery-scrub
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:20: run: local dir=td/osd-recovery-scrub
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:21: run: shift
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:23: run: export CEPH_MON=127.0.0.1:7124
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:23: run: CEPH_MON=127.0.0.1:7124
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:24: run: export CEPH_ARGS
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:25: run: uuidgen
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:25: run: CEPH_ARGS+='--fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none '
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:26: run: CEPH_ARGS+='--mon-host=127.0.0.1:7124 '
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:27: run: CEPH_ARGS+='--osd-op-queue=wpq '
2026-03-08T22:46:32.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:29: run: export -n CEPH_CLI_TEST_DUP_COMMAND
2026-03-08T22:46:32.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:30: run: set
2026-03-08T22:46:32.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:30: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p'
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:30: run: local funcs=TEST_recovery_scrub_1
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:31: run: for func in $funcs
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:32: run: TEST_recovery_scrub_1 td/osd-recovery-scrub
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:39: TEST_recovery_scrub_1: local dir=td/osd-recovery-scrub
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:40: TEST_recovery_scrub_1: local poolname=test
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:42: TEST_recovery_scrub_1: TESTDATA=testdata.34483
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:43: TEST_recovery_scrub_1: OSDS=4
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:44: TEST_recovery_scrub_1: PGS=1
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:45: TEST_recovery_scrub_1: OBJECTS=100
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:46: TEST_recovery_scrub_1: ERRORS=0
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:48: TEST_recovery_scrub_1: setup td/osd-recovery-scrub
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-recovery-scrub
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-recovery-scrub
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-recovery-scrub
2026-03-08T22:46:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:46:32.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-recovery-scrub KILL
2026-03-08T22:46:32.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:46:32.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:46:32.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:46:32.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:46:32.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:46:32.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:46:32.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:46:32.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:46:32.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:46:32.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:46:32.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:46:32.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:46:32.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:46:32.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:46:32.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:46:32.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:46:32.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:46:32.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:46:32.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-recovery-scrub
2026-03-08T22:46:32.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:46:32.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:32.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:32.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.34483
2026-03-08T22:46:32.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:46:32.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:46:32.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-recovery-scrub
2026-03-08T22:46:32.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:46:32.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:32.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:32.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.34483
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']'
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-recovery-scrub 1' TERM HUP INT
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:49: TEST_recovery_scrub_1: run_mon td/osd-recovery-scrub a --osd_pool_default_size=1 --mon_allow_pool_size_one=true
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-recovery-scrub
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:46:32.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:46:32.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-recovery-scrub/a
2026-03-08T22:46:32.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-recovery-scrub/a --run-dir=td/osd-recovery-scrub --osd_pool_default_size=1 --mon_allow_pool_size_one=true
2026-03-08T22:46:32.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:46:32.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:46:32.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:46:32.806 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:46:32.806 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:32.806 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:32.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.34483/$cluster-$name.asok'
2026-03-08T22:46:32.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-recovery-scrub/a '--log-file=td/osd-recovery-scrub/$name.log' '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --mon-cluster-log-file=td/osd-recovery-scrub/log --run-dir=td/osd-recovery-scrub '--pid-file=td/osd-recovery-scrub/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=1 --mon_allow_pool_size_one=true
2026-03-08T22:46:32.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:46:32.833 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:46:32.833 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:46:32.833 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:46:32.833 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:46:32.833 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.34483/ceph-mon.a.asok
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:46:32.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.34483/ceph-mon.a.asok config get fsid
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:32.900 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:32.901 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.34483/ceph-mon.a.asok
2026-03-08T22:46:32.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:46:32.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.34483/ceph-mon.a.asok config get mon_host
2026-03-08T22:46:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:50: TEST_recovery_scrub_1: run_mgr td/osd-recovery-scrub x
2026-03-08T22:46:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-recovery-scrub
2026-03-08T22:46:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:46:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:46:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:46:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-recovery-scrub/x
2026-03-08T22:46:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:46:33.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:46:33.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:46:33.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:46:33.071 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:46:33.071 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:33.071 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:33.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.34483/$cluster-$name.asok'
2026-03-08T22:46:33.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:46:33.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-recovery-scrub/x '--log-file=td/osd-recovery-scrub/$name.log' '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --run-dir=td/osd-recovery-scrub '--pid-file=td/osd-recovery-scrub/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:46:33.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:51: TEST_recovery_scrub_1: local 'ceph_osd_args=--osd-scrub-interval-randomize-ratio=0 '
2026-03-08T22:46:33.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:52: TEST_recovery_scrub_1: ceph_osd_args+='--osd_scrub_backoff_ratio=0 '
2026-03-08T22:46:33.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:53: TEST_recovery_scrub_1: ceph_osd_args+='--osd_stats_update_period_not_scrubbing=3 '
2026-03-08T22:46:33.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:54: TEST_recovery_scrub_1: ceph_osd_args+=--osd_stats_update_period_scrubbing=2
2026-03-08T22:46:33.091 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:55: TEST_recovery_scrub_1: expr 4 - 1
2026-03-08T22:46:33.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:55: TEST_recovery_scrub_1: seq 0 3
2026-03-08T22:46:33.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:55: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:57: TEST_recovery_scrub_1: run_osd td/osd-recovery-scrub 0 --osd_scrub_during_recovery=false
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-recovery-scrub
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-recovery-scrub/0
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq '
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-recovery-scrub/0'
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-recovery-scrub/0/journal'
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-recovery-scrub'
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:33.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.34483/$cluster-$name.asok'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-recovery-scrub/$name.log'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-recovery-scrub/$name.pid'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_scrub_during_recovery=false
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-recovery-scrub/0
2026-03-08T22:46:33.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:46:33.102 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 65ed2061-7b11-4740-a073-6611d0ec813e
2026-03-08T22:46:33.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=65ed2061-7b11-4740-a073-6611d0ec813e
2026-03-08T22:46:33.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 65ed2061-7b11-4740-a073-6611d0ec813e'
2026-03-08T22:46:33.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:46:33.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBJ/K1pPY9kBhAASpJsHVqVEi2cEJqPY3hRXQ==
2026-03-08T22:46:33.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBJ/K1pPY9kBhAASpJsHVqVEi2cEJqPY3hRXQ=="}'
2026-03-08T22:46:33.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 65ed2061-7b11-4740-a073-6611d0ec813e -i td/osd-recovery-scrub/0/new.json
2026-03-08T22:46:33.209 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:46:33.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-recovery-scrub/0/new.json
2026-03-08T22:46:33.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-recovery-scrub/0 --osd-journal=td/osd-recovery-scrub/0/journal --chdir= --run-dir=td/osd-recovery-scrub '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-recovery-scrub/$name.log' '--pid-file=td/osd-recovery-scrub/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --mkfs --key AQBJ/K1pPY9kBhAASpJsHVqVEi2cEJqPY3hRXQ== --osd-uuid 65ed2061-7b11-4740-a073-6611d0ec813e
2026-03-08T22:46:33.238 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:33.224+0000 7f862fe058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:33.239 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:33.228+0000 7f862fe058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:33.241 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:33.228+0000 7f862fe058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:33.241 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:33.228+0000 7f862fe058c0 -1 bdev(0x558dec368c00 td/osd-recovery-scrub/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:46:33.241 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:33.228+0000 7f862fe058c0 -1 bluestore(td/osd-recovery-scrub/0) _read_fsid unparsable uuid 2026-03-08T22:46:35.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-recovery-scrub/0/keyring 2026-03-08T22:46:35.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:46:35.526 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T22:46:35.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:46:35.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-recovery-scrub/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:46:35.635 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:46:35.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:46:35.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:46:35.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:46:35.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump 
--format=json 2026-03-08T22:46:35.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-recovery-scrub/0 --osd-journal=td/osd-recovery-scrub/0/journal --chdir= --run-dir=td/osd-recovery-scrub '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-recovery-scrub/$name.log' '--pid-file=td/osd-recovery-scrub/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false 2026-03-08T22:46:35.684 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:35.668+0000 7fb56290d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:35.690 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:35.676+0000 7fb56290d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:35.698 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:35.680+0000 7fb56290d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:46:35.781 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:35.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:35.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:36.652 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:36.640+0000 7fb56290d8c0 -1 Falling back to public interface 2026-03-08T22:46:36.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:46:36.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:36.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:46:36.954 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:46:36.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:36.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:37.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:37.669 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:37.656+0000 7fb56290d8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:46:38.125 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:46:38.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:38.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:38.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:46:38.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:38.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:46:38.294 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 
up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2221065452,v1:127.0.0.1:6803/2221065452] [v2:127.0.0.1:6804/2221065452,v1:127.0.0.1:6805/2221065452] exists,up 65ed2061-7b11-4740-a073-6611d0ec813e 2026-03-08T22:46:38.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:46:38.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:46:38.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:46:38.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:55: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:46:38.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:57: TEST_recovery_scrub_1: run_osd td/osd-recovery-scrub 1 --osd_scrub_during_recovery=false 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-recovery-scrub 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-recovery-scrub/1 2026-03-08T22:46:38.295 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq ' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-recovery-scrub/1' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-recovery-scrub/1/journal' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-recovery-scrub' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:46:38.295 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483 2026-03-08T22:46:38.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.34483/$cluster-$name.asok' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-recovery-scrub/$name.log' 2026-03-08T22:46:38.296 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-recovery-scrub/$name.pid' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_scrub_during_recovery=false 2026-03-08T22:46:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-recovery-scrub/1 2026-03-08T22:46:38.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:46:38.297 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 7b545edc-33c3-408f-86f9-80e53756f0a1 2026-03-08T22:46:38.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=7b545edc-33c3-408f-86f9-80e53756f0a1 2026-03-08T22:46:38.297 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 7b545edc-33c3-408f-86f9-80e53756f0a1' 2026-03-08T22:46:38.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:46:38.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBO/K1pulYEEhAASeZV/06PKy9ADUVjErD8zw== 2026-03-08T22:46:38.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBO/K1pulYEEhAASeZV/06PKy9ADUVjErD8zw=="}' 2026-03-08T22:46:38.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 7b545edc-33c3-408f-86f9-80e53756f0a1 -i td/osd-recovery-scrub/1/new.json 2026-03-08T22:46:38.564 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:46:38.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-recovery-scrub/1/new.json 2026-03-08T22:46:38.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-recovery-scrub/1 --osd-journal=td/osd-recovery-scrub/1/journal --chdir= --run-dir=td/osd-recovery-scrub '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-recovery-scrub/$name.log' '--pid-file=td/osd-recovery-scrub/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --mkfs --key AQBO/K1pulYEEhAASeZV/06PKy9ADUVjErD8zw== --osd-uuid 7b545edc-33c3-408f-86f9-80e53756f0a1 2026-03-08T22:46:38.595 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:38.584+0000 7f65773d98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:38.597 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:38.584+0000 7f65773d98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:38.598 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:38.588+0000 7f65773d98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:38.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:38.588+0000 7f65773d98c0 -1 bdev(0x563dcabf7c00 td/osd-recovery-scrub/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:46:38.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:38.588+0000 7f65773d98c0 -1 bluestore(td/osd-recovery-scrub/1) _read_fsid unparsable uuid 2026-03-08T22:46:41.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-recovery-scrub/1/keyring 2026-03-08T22:46:41.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:46:41.364 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T22:46:41.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:46:41.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-recovery-scrub/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' 
mgr 'allow profile osd' 2026-03-08T22:46:41.584 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T22:46:41.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:46:41.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-recovery-scrub/1 --osd-journal=td/osd-recovery-scrub/1/journal --chdir= --run-dir=td/osd-recovery-scrub '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-recovery-scrub/$name.log' '--pid-file=td/osd-recovery-scrub/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false 2026-03-08T22:46:41.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:46:41.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:46:41.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:46:41.601 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:41.588+0000 7fc9adbf88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:41.602 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:41.588+0000 7fc9adbf88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:46:41.603 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:41.592+0000 7fc9adbf88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:46:41.767 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:41.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:41.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:42.803 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:42.792+0000 7fc9adbf88c0 -1 Falling back to public 
interface 2026-03-08T22:46:42.941 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:46:42.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:42.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:42.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:46:42.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:42.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:43.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:44.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:44.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:44.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:46:44.122 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:46:44.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:44.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:44.166 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:44.156+0000 7fc9adbf88c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:46:44.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:46:45.320 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:46:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:46:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:46:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:46:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:46:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3140935254,v1:127.0.0.1:6811/3140935254] [v2:127.0.0.1:6812/3140935254,v1:127.0.0.1:6813/3140935254] exists,up 7b545edc-33c3-408f-86f9-80e53756f0a1 2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:46:45.489 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:55: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:57: TEST_recovery_scrub_1: run_osd td/osd-recovery-scrub 2 --osd_scrub_during_recovery=false
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-recovery-scrub
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-recovery-scrub/2
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq '
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-recovery-scrub/2'
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-recovery-scrub/2/journal'
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-recovery-scrub'
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:46:45.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.34483/$cluster-$name.asok'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-recovery-scrub/$name.log'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-recovery-scrub/$name.pid'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_scrub_during_recovery=false
2026-03-08T22:46:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-recovery-scrub/2
2026-03-08T22:46:45.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:46:45.491 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 1ce5aa6b-b2c2-4127-a32e-de531057d8ad
2026-03-08T22:46:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1ce5aa6b-b2c2-4127-a32e-de531057d8ad
2026-03-08T22:46:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 1ce5aa6b-b2c2-4127-a32e-de531057d8ad'
2026-03-08T22:46:45.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:46:45.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBV/K1p0TCRHRAA2I5TDs0M6r+RLnI24FvvWg==
2026-03-08T22:46:45.503
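[Editor's note: the run_osd trace shows ceph-helpers.sh accumulating all daemon options into one ceph_args string. A simplified sketch of that pattern (the fsid below is a placeholder, not the one from this run):]

```shell
# Simplified sketch of the argument-accumulation pattern traced above.
# Options are appended to a single string; metavariables such as $name are
# kept inside single quotes so the shell passes them through literally and
# ceph-osd expands them itself (per-daemon log and pid files).
dir=td/osd-recovery-scrub
id=2
ceph_args="--fsid=00000000-0000-0000-0000-000000000000 --auth-supported=none"
ceph_args+=" --osd-data=$dir/$id"
ceph_args+=' --log-file='$dir'/$name.log'   # $name stays literal for ceph-osd
echo "$ceph_args"
```

This is why the final ceph-osd invocations in the log show '--log-file=td/osd-recovery-scrub/$name.log' still quoted: the daemon, not the shell, substitutes $name.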
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBV/K1p0TCRHRAA2I5TDs0M6r+RLnI24FvvWg=="}'
2026-03-08T22:46:45.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1ce5aa6b-b2c2-4127-a32e-de531057d8ad -i td/osd-recovery-scrub/2/new.json
2026-03-08T22:46:45.665 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:46:45.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-recovery-scrub/2/new.json
2026-03-08T22:46:45.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-recovery-scrub/2 --osd-journal=td/osd-recovery-scrub/2/journal --chdir= --run-dir=td/osd-recovery-scrub '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-recovery-scrub/$name.log' '--pid-file=td/osd-recovery-scrub/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --mkfs --key AQBV/K1p0TCRHRAA2I5TDs0M6r+RLnI24FvvWg== --osd-uuid 1ce5aa6b-b2c2-4127-a32e-de531057d8ad
2026-03-08T22:46:45.695 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:45.684+0000 7ff2c69648c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:45.697 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:45.684+0000 7ff2c69648c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:45.698 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:45.684+0000 7ff2c69648c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:45.698 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:45.688+0000 7ff2c69648c0 -1 bdev(0x5573363f9c00 td/osd-recovery-scrub/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:46:45.698 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:45.688+0000 7ff2c69648c0 -1 bluestore(td/osd-recovery-scrub/2) _read_fsid unparsable uuid
2026-03-08T22:46:48.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-recovery-scrub/2/keyring
2026-03-08T22:46:48.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:46:48.013 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository
2026-03-08T22:46:48.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T22:46:48.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-recovery-scrub/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:46:48.237 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2
2026-03-08T22:46:48.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T22:46:48.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-recovery-scrub/2 --osd-journal=td/osd-recovery-scrub/2/journal --chdir= --run-dir=td/osd-recovery-scrub '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-recovery-scrub/$name.log' '--pid-file=td/osd-recovery-scrub/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false
2026-03-08T22:46:48.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:46:48.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:46:48.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:46:48.254 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:48.240+0000 7f11d66f68c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:48.262 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:48.252+0000 7f11d66f68c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:48.264 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:48.252+0000 7f11d66f68c0 -1 WARNING: all dangerous and experimental features are enabled.
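[Editor's note: the three xtrace lines for ceph-helpers.sh:681 are one pipeline, traced innermost command first. A hedged sketch of that check, assuming jq is available as it is in this test environment; the function name is made up for illustration:]

```shell
# Sketch of the "noup" gate traced at ceph-helpers.sh:681: the osdmap
# flags are extracted with jq, and the helper only waits for the OSD to
# come up if the cluster-wide "noup" flag is NOT set.
osd_is_allowed_up() {
    ! ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'
}
```

In the trace the grep finds nothing (no "noup" flag), so run_osd proceeds to wait_for_osd up 2.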
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:48.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:46:48.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:49.459 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:49.448+0000 7f11d66f68c0 -1 Falling back to public interface
2026-03-08T22:46:49.611 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:46:49.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:49.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:49.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:46:49.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:49.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:46:49.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:50.438 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:50.428+0000 7f11d66f68c0 -1 osd.2 0 log_to_monitors true
2026-03-08T22:46:50.789 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:46:50.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:50.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:50.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:46:50.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:50.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:46:50.957 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3199541307,v1:127.0.0.1:6819/3199541307] [v2:127.0.0.1:6820/3199541307,v1:127.0.0.1:6821/3199541307] exists,up 1ce5aa6b-b2c2-4127-a32e-de531057d8ad
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:55: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:57: TEST_recovery_scrub_1: run_osd td/osd-recovery-scrub 3 --osd_scrub_during_recovery=false
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-recovery-scrub
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-recovery-scrub/3
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq '
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-recovery-scrub/3'
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-recovery-scrub/3/journal'
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:46:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-recovery-scrub'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.34483/$cluster-$name.asok'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-recovery-scrub/$name.log'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-recovery-scrub/$name.pid'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_scrub_during_recovery=false
2026-03-08T22:46:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-recovery-scrub/3
2026-03-08T22:46:50.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:46:50.961 INFO:tasks.workunit.client.0.vm03.stdout:add osd3 95e8acdc-9cf1-4f00-bfd7-1a5ab633eb98
2026-03-08T22:46:50.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=95e8acdc-9cf1-4f00-bfd7-1a5ab633eb98
2026-03-08T22:46:50.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 95e8acdc-9cf1-4f00-bfd7-1a5ab633eb98'
2026-03-08T22:46:50.961 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:46:50.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBa/K1pf06OORAA7O6gLN2EnRZZ+dmWV/yOjA==
2026-03-08T22:46:50.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBa/K1pf06OORAA7O6gLN2EnRZZ+dmWV/yOjA=="}'
2026-03-08T22:46:50.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 95e8acdc-9cf1-4f00-bfd7-1a5ab633eb98 -i td/osd-recovery-scrub/3/new.json
2026-03-08T22:46:51.136 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:46:51.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-recovery-scrub/3/new.json
2026-03-08T22:46:51.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-recovery-scrub/3 --osd-journal=td/osd-recovery-scrub/3/journal --chdir= --run-dir=td/osd-recovery-scrub '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-recovery-scrub/$name.log' '--pid-file=td/osd-recovery-scrub/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --mkfs --key AQBa/K1pf06OORAA7O6gLN2EnRZZ+dmWV/yOjA== --osd-uuid 95e8acdc-9cf1-4f00-bfd7-1a5ab633eb98
2026-03-08T22:46:51.166 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:51.156+0000 7ff9f0c968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:51.168 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:51.156+0000 7ff9f0c968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:51.169 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:51.156+0000 7ff9f0c968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:51.169 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:51.156+0000 7ff9f0c968c0 -1 bdev(0x55c9c1af1c00 td/osd-recovery-scrub/3/block) open stat got: (1) Operation not permitted
2026-03-08T22:46:51.169 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:51.156+0000 7ff9f0c968c0 -1 bluestore(td/osd-recovery-scrub/3) _read_fsid unparsable uuid
2026-03-08T22:46:53.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-recovery-scrub/3/keyring
2026-03-08T22:46:53.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:46:53.924 INFO:tasks.workunit.client.0.vm03.stdout:adding osd3 key to auth repository
2026-03-08T22:46:53.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository
2026-03-08T22:46:53.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-recovery-scrub/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:46:54.132 INFO:tasks.workunit.client.0.vm03.stdout:start osd.3
2026-03-08T22:46:54.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3
2026-03-08T22:46:54.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=81eb1ecf-828d-4553-a6e6-740051b84f8a --auth-supported=none --mon-host=127.0.0.1:7124 --osd-op-queue=wpq --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-recovery-scrub/3 --osd-journal=td/osd-recovery-scrub/3/journal --chdir= --run-dir=td/osd-recovery-scrub '--admin-socket=/tmp/ceph-asok.34483/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-recovery-scrub/$name.log' '--pid-file=td/osd-recovery-scrub/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false
2026-03-08T22:46:54.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:46:54.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:46:54.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:46:54.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:54.136+0000 7f39afbe88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:54.151 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:54.140+0000 7f39afbe88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:54.153 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:54.140+0000 7f39afbe88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:54.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:46:54.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:55.351 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:55.340+0000 7f39afbe88c0 -1 Falling back to public interface
2026-03-08T22:46:55.488 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:46:55.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:55.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:55.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:46:55.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:55.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:46:55.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:46:56.348 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:46:56.336+0000 7f39afbe88c0 -1 osd.3 0 log_to_monitors true
2026-03-08T22:46:56.661 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:46:56.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:46:56.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:46:56.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:46:56.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:46:56.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:46:56.829 INFO:tasks.workunit.client.0.vm03.stdout:osd.3 up in weight 1 up_from 19 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3277653057,v1:127.0.0.1:6827/3277653057] [v2:127.0.0.1:6828/3277653057,v1:127.0.0.1:6829/3277653057] exists,up 95e8acdc-9cf1-4f00-bfd7-1a5ab633eb98
2026-03-08T22:46:56.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:46:56.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:46:56.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:46:56.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:61: TEST_recovery_scrub_1: create_pool test 1 1
2026-03-08T22:46:56.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1
2026-03-08T22:46:57.048 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created
2026-03-08T22:46:57.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:46:58.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:62: TEST_recovery_scrub_1: wait_for_clean
2026-03-08T22:46:58.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:46:58.068
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:46:58.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:46:58.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:46:58.069 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:46:58.069 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:46:58.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:46:58.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:46:58.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:46:58.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:46:58.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:46:58.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:46:58.126 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:46:58.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:46:58.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:46:58.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:46:58.305 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:46:58.305 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:46:58.305 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:46:58.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:46:58.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:58.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:46:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T22:46:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T22:46:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T22:46:58.407 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:58.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:46:58.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672964 2026-03-08T22:46:58.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672964 2026-03-08T22:46:58.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964' 2026-03-08T22:46:58.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:58.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:46:58.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542147 2026-03-08T22:46:58.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542147 2026-03-08T22:46:58.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964 2-60129542147' 2026-03-08T22:46:58.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:46:58.579 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:46:58.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=81604378626 2026-03-08T22:46:58.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 81604378626 2026-03-08T22:46:58.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964 2-60129542147 3-81604378626' 2026-03-08T22:46:58.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:46:58.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T22:46:58.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:46:58.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:46:58.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T22:46:58.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:46:58.671 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T22:46:58.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=21474836485 2026-03-08T22:46:58.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T22:46:58.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:46:58.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T22:46:58.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:46:59.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:46:59.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:47:00.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836485 2026-03-08T22:47:00.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:47:00.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672964 2026-03-08T22:47:00.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:47:00.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:47:00.046 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672964 2026-03-08T22:47:00.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:47:00.047 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672964 2026-03-08T22:47:00.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672964 2026-03-08T22:47:00.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672964' 2026-03-08T22:47:00.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:47:00.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672964 -lt 42949672964 2026-03-08T22:47:00.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:47:00.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542147 2026-03-08T22:47:00.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:47:00.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:47:00.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542147 
2026-03-08T22:47:00.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:47:00.230 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542147 2026-03-08T22:47:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542147 2026-03-08T22:47:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542147' 2026-03-08T22:47:00.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:47:00.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542147 -lt 60129542147 2026-03-08T22:47:00.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:47:00.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-81604378626 2026-03-08T22:47:00.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:47:00.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:47:00.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-81604378626 2026-03-08T22:47:00.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d 
- -f 2 2026-03-08T22:47:00.402 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 81604378626 2026-03-08T22:47:00.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=81604378626 2026-03-08T22:47:00.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 81604378626' 2026-03-08T22:47:00.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:47:00.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 81604378626 -lt 81604378626 2026-03-08T22:47:00.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:47:00.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:00.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:00.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:47:00.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:47:00.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:47:00.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: 
get_num_active_clean: local expression 2026-03-08T22:47:00.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:47:00.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:47:00.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:47:00.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:47:00.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:47:00.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:47:00.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:47:00.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:47:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:47:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:47:01.174 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:47:01.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:63: TEST_recovery_scrub_1: ceph osd dump 2026-03-08T22:47:01.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:63: TEST_recovery_scrub_1: awk '{ print $2 }' 2026-03-08T22:47:01.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:63: TEST_recovery_scrub_1: grep '^pool.*['\'']test['\'']' 2026-03-08T22:47:01.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:63: TEST_recovery_scrub_1: poolid=1 2026-03-08T22:47:01.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:65: TEST_recovery_scrub_1: ceph pg dump pgs 2026-03-08T22:47:01.504 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:47:01.504 INFO:tasks.workunit.client.0.vm03.stdout:1.0 0 0 0 0 0 0 0 0 0 0 0 active+clean 2026-03-08T22:46:57.263388+0000 0'0 22:11 [1] 1 [1] 1 0'0 2026-03-08T22:46:57.039977+0000 0'0 2026-03-08T22:46:57.039977+0000 0 0 periodic scrub scheduled @ 2026-03-10T05:17:33.653120+0000 0 0 2026-03-08T22:47:01.504 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:47:01.504 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards 
depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:47:01.504 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:47:01.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:67: TEST_recovery_scrub_1: dd if=/dev/urandom of=testdata.34483 bs=1M count=50 2026-03-08T22:47:01.634 INFO:tasks.workunit.client.0.vm03.stderr:50+0 records in 2026-03-08T22:47:01.634 INFO:tasks.workunit.client.0.vm03.stderr:50+0 records out 2026-03-08T22:47:01.634 INFO:tasks.workunit.client.0.vm03.stderr:52428800 bytes (52 MB, 50 MiB) copied, 0.116675 s, 449 MB/s 2026-03-08T22:47:01.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: seq 1 100 2026-03-08T22:47:01.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:01.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj1 testdata.34483 2026-03-08T22:47:01.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:01.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj2 testdata.34483 2026-03-08T22:47:02.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:02.062 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj3 testdata.34483 2026-03-08T22:47:02.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:02.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj4 testdata.34483 2026-03-08T22:47:02.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:02.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj5 testdata.34483 2026-03-08T22:47:02.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:02.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj6 testdata.34483 2026-03-08T22:47:02.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:02.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj7 testdata.34483 2026-03-08T22:47:03.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 
1 $OBJECTS) 2026-03-08T22:47:03.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj8 testdata.34483 2026-03-08T22:47:03.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:03.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj9 testdata.34483 2026-03-08T22:47:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj10 testdata.34483 2026-03-08T22:47:03.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:03.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj11 testdata.34483 2026-03-08T22:47:03.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:03.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj12 testdata.34483 2026-03-08T22:47:03.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: 
TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:03.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj13 testdata.34483 2026-03-08T22:47:04.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:04.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj14 testdata.34483 2026-03-08T22:47:04.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:04.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj15 testdata.34483 2026-03-08T22:47:04.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:04.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj16 testdata.34483 2026-03-08T22:47:04.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:04.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj17 testdata.34483 2026-03-08T22:47:04.968 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:04.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj18 testdata.34483 2026-03-08T22:47:05.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:05.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj19 testdata.34483 2026-03-08T22:47:05.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:05.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj20 testdata.34483 2026-03-08T22:47:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj21 testdata.34483 2026-03-08T22:47:06.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:06.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put 
obj22 testdata.34483 2026-03-08T22:47:06.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:06.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj23 testdata.34483 2026-03-08T22:47:06.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:06.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj24 testdata.34483 2026-03-08T22:47:06.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:06.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj25 testdata.34483 2026-03-08T22:47:06.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:06.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj26 testdata.34483 2026-03-08T22:47:07.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:07.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: 
TEST_recovery_scrub_1: rados -p test put obj27 testdata.34483 2026-03-08T22:47:07.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:07.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj28 testdata.34483 2026-03-08T22:47:07.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:07.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj29 testdata.34483 2026-03-08T22:47:07.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:07.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj30 testdata.34483 2026-03-08T22:47:08.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:08.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj31 testdata.34483 2026-03-08T22:47:08.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:08.367 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj32 testdata.34483 2026-03-08T22:47:08.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:08.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj33 testdata.34483 2026-03-08T22:47:08.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:08.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj34 testdata.34483 2026-03-08T22:47:08.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:08.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj35 testdata.34483 2026-03-08T22:47:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj36 testdata.34483 2026-03-08T22:47:09.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in 
$(seq 1 $OBJECTS) 2026-03-08T22:47:09.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj37 testdata.34483 2026-03-08T22:47:09.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:09.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj38 testdata.34483 2026-03-08T22:47:09.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:09.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj39 testdata.34483 2026-03-08T22:47:09.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:09.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj40 testdata.34483 2026-03-08T22:47:09.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:09.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj41 testdata.34483 2026-03-08T22:47:10.174 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:10.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj42 testdata.34483 2026-03-08T22:47:10.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:10.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj43 testdata.34483 2026-03-08T22:47:10.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:10.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj44 testdata.34483 2026-03-08T22:47:10.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:10.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj45 testdata.34483 2026-03-08T22:47:11.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:11.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put 
obj46 testdata.34483 2026-03-08T22:47:11.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:11.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj47 testdata.34483 2026-03-08T22:47:11.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:11.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj48 testdata.34483 2026-03-08T22:47:11.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:11.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj49 testdata.34483 2026-03-08T22:47:11.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:11.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj50 testdata.34483 2026-03-08T22:47:12.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:12.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: 
TEST_recovery_scrub_1: rados -p test put obj51 testdata.34483 2026-03-08T22:47:12.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:12.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj52 testdata.34483 2026-03-08T22:47:12.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:12.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj53 testdata.34483 2026-03-08T22:47:12.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:12.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj54 testdata.34483 2026-03-08T22:47:12.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:12.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj55 testdata.34483 2026-03-08T22:47:13.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:13.074 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj56 testdata.34483 2026-03-08T22:47:13.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:13.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj57 testdata.34483 2026-03-08T22:47:13.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:13.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj58 testdata.34483 2026-03-08T22:47:13.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:13.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj59 testdata.34483 2026-03-08T22:47:14.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:14.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj60 testdata.34483 2026-03-08T22:47:14.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in 
$(seq 1 $OBJECTS) 2026-03-08T22:47:14.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj61 testdata.34483 2026-03-08T22:47:14.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:14.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj62 testdata.34483 2026-03-08T22:47:14.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:14.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj63 testdata.34483 2026-03-08T22:47:14.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:14.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj64 testdata.34483 2026-03-08T22:47:15.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:15.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj65 testdata.34483 2026-03-08T22:47:15.445 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:15.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj66 testdata.34483 2026-03-08T22:47:15.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:15.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj67 testdata.34483 2026-03-08T22:47:15.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:15.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj68 testdata.34483 2026-03-08T22:47:16.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:16.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj69 testdata.34483 2026-03-08T22:47:16.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:16.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put 
obj70 testdata.34483 2026-03-08T22:47:16.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:16.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj71 testdata.34483 2026-03-08T22:47:16.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:16.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj72 testdata.34483 2026-03-08T22:47:16.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:16.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj73 testdata.34483 2026-03-08T22:47:17.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:17.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj74 testdata.34483 2026-03-08T22:47:17.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:17.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: 
TEST_recovery_scrub_1: rados -p test put obj75 testdata.34483 2026-03-08T22:47:17.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:17.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj76 testdata.34483 2026-03-08T22:47:17.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:17.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj77 testdata.34483 2026-03-08T22:47:17.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:17.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj78 testdata.34483 2026-03-08T22:47:18.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:18.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj79 testdata.34483 2026-03-08T22:47:18.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:18.240 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj80 testdata.34483 2026-03-08T22:47:18.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:18.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj81 testdata.34483 2026-03-08T22:47:18.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:18.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj82 testdata.34483 2026-03-08T22:47:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj83 testdata.34483 2026-03-08T22:47:18.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:18.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj84 testdata.34483 2026-03-08T22:47:19.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in 
$(seq 1 $OBJECTS) 2026-03-08T22:47:19.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj85 testdata.34483 2026-03-08T22:47:19.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:19.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj86 testdata.34483 2026-03-08T22:47:19.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:19.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj87 testdata.34483 2026-03-08T22:47:19.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:19.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj88 testdata.34483 2026-03-08T22:47:19.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:19.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj89 testdata.34483 2026-03-08T22:47:20.104 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:20.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj90 testdata.34483 2026-03-08T22:47:20.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:20.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj91 testdata.34483 2026-03-08T22:47:20.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:20.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj92 testdata.34483 2026-03-08T22:47:20.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:20.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj93 testdata.34483 2026-03-08T22:47:20.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:20.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put 
obj94 testdata.34483 2026-03-08T22:47:21.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:21.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj95 testdata.34483 2026-03-08T22:47:21.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:21.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj96 testdata.34483 2026-03-08T22:47:21.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:21.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj97 testdata.34483 2026-03-08T22:47:21.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:21.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj98 testdata.34483 2026-03-08T22:47:22.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:22.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: 
TEST_recovery_scrub_1: rados -p test put obj99 testdata.34483 2026-03-08T22:47:22.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:68: TEST_recovery_scrub_1: for i in $(seq 1 $OBJECTS) 2026-03-08T22:47:22.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:70: TEST_recovery_scrub_1: rados -p test put obj100 testdata.34483 2026-03-08T22:47:22.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:72: TEST_recovery_scrub_1: rm -f testdata.34483 2026-03-08T22:47:22.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:74: TEST_recovery_scrub_1: ceph osd pool set test size 4 2026-03-08T22:47:22.682 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 size to 4 2026-03-08T22:47:22.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:77: TEST_recovery_scrub_1: set -o pipefail 2026-03-08T22:47:22.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:78: TEST_recovery_scrub_1: count=0 2026-03-08T22:47:22.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:79: TEST_recovery_scrub_1: true 2026-03-08T22:47:22.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:81: TEST_recovery_scrub_1: ceph --format json pg dump pgs 2026-03-08T22:47:22.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:82: TEST_recovery_scrub_1: grep -q true 2026-03-08T22:47:22.698 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:82: TEST_recovery_scrub_1: jq '.pg_stats | [.[] | .state | contains("recovering")]' 2026-03-08T22:47:22.843 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:47:22.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:86: TEST_recovery_scrub_1: sleep 2 2026-03-08T22:47:24.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:87: TEST_recovery_scrub_1: test 0 -eq 10 2026-03-08T22:47:24.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:92: TEST_recovery_scrub_1: expr 0 + 1 2026-03-08T22:47:24.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:92: TEST_recovery_scrub_1: count=1 2026-03-08T22:47:24.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:79: TEST_recovery_scrub_1: true 2026-03-08T22:47:24.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:81: TEST_recovery_scrub_1: ceph --format json pg dump pgs 2026-03-08T22:47:24.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:82: TEST_recovery_scrub_1: grep -q true 2026-03-08T22:47:24.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:82: TEST_recovery_scrub_1: jq '.pg_stats | [.[] | .state | contains("recovering")]' 2026-03-08T22:47:25.082 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:47:25.110 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:86: TEST_recovery_scrub_1: sleep 2 2026-03-08T22:47:27.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:87: TEST_recovery_scrub_1: test 1 -eq 10 2026-03-08T22:47:27.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:92: TEST_recovery_scrub_1: expr 1 + 1 2026-03-08T22:47:27.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:92: TEST_recovery_scrub_1: count=2 2026-03-08T22:47:27.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:79: TEST_recovery_scrub_1: true 2026-03-08T22:47:27.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:81: TEST_recovery_scrub_1: ceph --format json pg dump pgs 2026-03-08T22:47:27.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:82: TEST_recovery_scrub_1: grep -q true 2026-03-08T22:47:27.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:82: TEST_recovery_scrub_1: jq '.pg_stats | [.[] | .state | contains("recovering")]' 2026-03-08T22:47:27.394 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:47:27.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:84: TEST_recovery_scrub_1: break 2026-03-08T22:47:27.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:94: TEST_recovery_scrub_1: set +o pipefail 2026-03-08T22:47:27.408 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:95: TEST_recovery_scrub_1: ceph pg dump pgs 2026-03-08T22:47:27.587 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:47:27.587 INFO:tasks.workunit.client.0.vm03.stdout:1.0 100 0 297 0 0 5242880000 0 0 1300 0 1300 active+recovering+undersized+degraded+remapped 2026-03-08T22:47:24.262417+0000 22'1300 25:1326 [1,0,2,3] 1 [1,0] 1 0'0 2026-03-08T22:46:57.039977+0000 0'0 2026-03-08T22:46:57.039977+0000 0 0 no scrub is scheduled 0 0 2026-03-08T22:47:27.587 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:47:27.587 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
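The xtrace above (osd-recovery-scrub.sh lines 78-92) is a bounded poll loop: dump PG stats as JSON, check whether any PG state contains "recovering", retry up to 10 times with a 2-second sleep. A minimal self-contained sketch of that pattern, with a canned state string standing in for the live `ceph --format json pg dump pgs | jq ... | grep -q true` pipeline:

```shell
# Sketch of the wait-for-recovery loop traced above. fake_pg_states is a
# stand-in (assumption) for the jq-extracted PG states of a live cluster.
fake_pg_states="active+recovering+undersized+degraded+remapped"
count=0
while true; do
    # In the real script this is: ceph --format json pg dump pgs |
    #   jq '.pg_stats | [.[] | .state | contains("recovering")]' | grep -q true
    if echo "$fake_pg_states" | grep -q recovering; then
        break                       # recovery observed; stop polling
    fi
    sleep 2
    if test "$count" -eq 10; then   # give up after 10 polls (~20s)
        echo "recovery never started" >&2
        exit 1
    fi
    count=$(expr "$count" + 1)
done
echo "recovery seen after $count polls"
```

Note the script enables `set -o pipefail` around this loop so a failed `ceph` or `jq` invocation is not masked by the trailing `grep`.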
2026-03-08T22:47:27.587 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:47:27.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:97: TEST_recovery_scrub_1: sleep 10 2026-03-08T22:47:37.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:99: TEST_recovery_scrub_1: kill_daemons td/osd-recovery-scrub 2026-03-08T22:47:37.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:37.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:37.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:37.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:37.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:42.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:42.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:101: TEST_recovery_scrub_1: declare -a err_strings 2026-03-08T22:47:42.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:102: TEST_recovery_scrub_1: err_strings[0]='recovery in progress. Only high priority scrubs allowed.' 
2026-03-08T22:47:42.959 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:104: TEST_recovery_scrub_1: expr 4 - 1 2026-03-08T22:47:42.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:104: TEST_recovery_scrub_1: seq 0 3 2026-03-08T22:47:42.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:104: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:42.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:106: TEST_recovery_scrub_1: grep 'recovery in progress. Only high priority scrubs allowed.' td/osd-recovery-scrub/osd.0.log 2026-03-08T22:47:42.962 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:34.768+0000 7fb55b8c1640 15 osd.0 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.962 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:37.720+0000 7fb55b8c1640 15 osd.0 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.962 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:39.756+0000 7fb55b8c1640 15 osd.0 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:104: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:42.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:106: TEST_recovery_scrub_1: grep 'recovery in progress. Only high priority scrubs allowed.' 
td/osd-recovery-scrub/osd.1.log 2026-03-08T22:47:42.969 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:24.332+0000 7fc9a6bac640 15 osd.1 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.969 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:27.360+0000 7fc9a6bac640 15 osd.1 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.969 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:32.252+0000 7fc9a6bac640 15 osd.1 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.969 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:37.360+0000 7fc9a6bac640 15 osd.1 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:104: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:42.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:106: TEST_recovery_scrub_1: grep 'recovery in progress. Only high priority scrubs allowed.' td/osd-recovery-scrub/osd.2.log 2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:25.400+0000 7f11cf6aa640 15 osd.2 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:26.388+0000 7f11cf6aa640 15 osd.2 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:27.404+0000 7f11cf6aa640 15 osd.2 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 
2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:28.428+0000 7f11cf6aa640 15 osd.2 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:29.400+0000 7f11cf6aa640 15 osd.2 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:33.448+0000 7f11cf6aa640 15 osd.2 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:34.452+0000 7f11cf6aa640 15 osd.2 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:104: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:42.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:106: TEST_recovery_scrub_1: grep 'recovery in progress. Only high priority scrubs allowed.' td/osd-recovery-scrub/osd.3.log 2026-03-08T22:47:42.972 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:27.624+0000 7f39a8b9c640 15 osd.3 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.972 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:30.628+0000 7f39a8b9c640 15 osd.3 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.972 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:32.676+0000 7f39a8b9c640 15 osd.3 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 
2026-03-08T22:47:42.972 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:34.752+0000 7f39a8b9c640 15 osd.3 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.972 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:35.800+0000 7f39a8b9c640 15 osd.3 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.973 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:36.748+0000 7f39a8b9c640 15 osd.3 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.973 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:47:39.840+0000 7f39a8b9c640 15 osd.3 osd-scrub:restrictions_on_scrubbing: recovery in progress. Only high priority scrubs allowed. 2026-03-08T22:47:42.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:108: TEST_recovery_scrub_1: for err_string in "${err_strings[@]}" 2026-03-08T22:47:42.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:110: TEST_recovery_scrub_1: found=false 2026-03-08T22:47:42.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:111: TEST_recovery_scrub_1: count=0 2026-03-08T22:47:42.973 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:112: TEST_recovery_scrub_1: expr 4 - 1 2026-03-08T22:47:42.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:112: TEST_recovery_scrub_1: seq 0 3 2026-03-08T22:47:42.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:112: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 
2026-03-08T22:47:42.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:114: TEST_recovery_scrub_1: grep -q 'recovery in progress. Only high priority scrubs allowed.' td/osd-recovery-scrub/osd.0.log 2026-03-08T22:47:42.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:116: TEST_recovery_scrub_1: found=true 2026-03-08T22:47:42.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:117: TEST_recovery_scrub_1: expr 0 + 1 2026-03-08T22:47:42.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:117: TEST_recovery_scrub_1: count=1 2026-03-08T22:47:42.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:112: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:42.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:114: TEST_recovery_scrub_1: grep -q 'recovery in progress. Only high priority scrubs allowed.' 
td/osd-recovery-scrub/osd.1.log 2026-03-08T22:47:42.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:116: TEST_recovery_scrub_1: found=true 2026-03-08T22:47:42.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:117: TEST_recovery_scrub_1: expr 1 + 1 2026-03-08T22:47:42.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:117: TEST_recovery_scrub_1: count=2 2026-03-08T22:47:42.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:112: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:42.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:114: TEST_recovery_scrub_1: grep -q 'recovery in progress. Only high priority scrubs allowed.' 
td/osd-recovery-scrub/osd.2.log 2026-03-08T22:47:42.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:116: TEST_recovery_scrub_1: found=true 2026-03-08T22:47:42.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:117: TEST_recovery_scrub_1: expr 2 + 1 2026-03-08T22:47:42.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:117: TEST_recovery_scrub_1: count=3 2026-03-08T22:47:42.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:112: TEST_recovery_scrub_1: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T22:47:42.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:114: TEST_recovery_scrub_1: grep -q 'recovery in progress. Only high priority scrubs allowed.' 
td/osd-recovery-scrub/osd.3.log 2026-03-08T22:47:42.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:116: TEST_recovery_scrub_1: found=true 2026-03-08T22:47:42.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:117: TEST_recovery_scrub_1: expr 3 + 1 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:117: TEST_recovery_scrub_1: count=4 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:120: TEST_recovery_scrub_1: '[' true = false ']' 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:124: TEST_recovery_scrub_1: '[' 4 -eq 4 ']' 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:127: TEST_recovery_scrub_1: teardown td/osd-recovery-scrub 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-recovery-scrub 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-recovery-scrub KILL 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:42.989 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:42.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:42.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:42.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:47:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:47:42.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:47:42.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:47:42.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:47:42.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:47:43.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:43.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:47:43.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:47:43.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:43.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:47:43.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:47:43.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-recovery-scrub 2026-03-08T22:47:43.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:47:43.030 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:43.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483 2026-03-08T22:47:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.34483 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:129: TEST_recovery_scrub_1: '[' 0 '!=' 0 ']' 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stdout:TEST PASSED 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:135: TEST_recovery_scrub_1: echo 'TEST PASSED' 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-recovery-scrub.sh:136: TEST_recovery_scrub_1: return 0 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-recovery-scrub 0 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-recovery-scrub 2026-03-08T22:47:43.031 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-recovery-scrub KILL 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:47:43.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:47:43.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:47:43.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:47:43.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:47:43.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:47:43.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:47:43.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:47:43.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:47:43.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:43.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:47:43.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:47:43.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:47:43.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:47:43.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T22:47:43.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-recovery-scrub 2026-03-08T22:47:43.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:47:43.037 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:43.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.34483 2026-03-08T22:47:43.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.34483 2026-03-08T22:47:43.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:47:43.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:47:43.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T22:47:43.038 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T22:47:43.038 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T22:47:43.090 INFO:tasks.workunit:Running workunit scrub/osd-scrub-dump.sh... 
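The verification step traced above (osd-recovery-scrub.sh lines 101-124) greps every OSD log for the expected "recovery in progress" message and passes only if all four OSDs logged it. A minimal sketch of that check, using temp files as stand-ins (assumption) for the real `td/osd-recovery-scrub/osd.N.log` files:

```shell
# Sketch of the per-OSD log verification traced above. Fabricated log
# files simulate td/osd-recovery-scrub/osd.N.log from a real run.
dir=$(mktemp -d)
OSDS=4
msg='recovery in progress. Only high priority scrubs allowed.'
for osd in $(seq 0 $(expr $OSDS - 1)); do
    echo "osd.$osd osd-scrub:restrictions_on_scrubbing: $msg" > "$dir/osd.$osd.log"
done

# Count how many OSD logs contain the expected message.
count=0
for osd in $(seq 0 $(expr $OSDS - 1)); do
    if grep -q "$msg" "$dir/osd.$osd.log"; then
        count=$(expr "$count" + 1)
    fi
done

# The test passes only when every OSD refused to schedule ordinary
# scrubs during recovery, i.e. count == OSDS.
test "$count" -eq "$OSDS" && echo "TEST PASSED"
rm -rf "$dir"
```

In the log above this check resolves as `'[' 4 -eq 4 ']'`, so the workunit prints TEST PASSED and proceeds to teardown.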
2026-03-08T22:47:43.090 DEBUG:teuthology.orchestra.run.vm03:workunit test scrub/osd-scrub-dump.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-dump.sh
2026-03-08T22:47:43.138 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: '
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-scrub-dump
2026-03-08T22:47:43.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-dump.sh:30: run: echo 'This test is disabled'
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stdout:This test is disabled
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-dump.sh:31: run: return 0
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-scrub-dump 0
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-dump
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-dump KILL
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:47:43.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:47:43.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:47:43.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:47:43.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:47:43.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:47:43.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:47:43.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:47:43.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:47:43.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:47:43.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:47:43.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:47:43.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:47:43.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:47:43.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']'
2026-03-08T22:47:43.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-dump
2026-03-08T22:47:43.148 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:47:43.148 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:43.148 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43000
2026-03-08T22:47:43.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43000
2026-03-08T22:47:43.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:47:43.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:47:43.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0
2026-03-08T22:47:43.149 INFO:teuthology.orchestra.run:Running command with timeout 3600
2026-03-08T22:47:43.149 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp
2026-03-08T22:47:43.198 INFO:tasks.workunit:Running workunit scrub/osd-scrub-repair.sh...
2026-03-08T22:47:43.198 DEBUG:teuthology.orchestra.run.vm03:workunit test scrub/osd-scrub-repair.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:+ source /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ TIMEOUT=300
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ WAIT_FOR_CLEAN_TIMEOUT=90
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ MAX_TIMEOUT=15
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ PG_NUM=4
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ TMPDIR=/tmp
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ CEPH_BUILD_VIRTUALENV=/tmp
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ TESTDIR=/home/ubuntu/cephtest
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ type xmlstarlet
2026-03-08T22:47:43.244 INFO:tasks.workunit.client.0.vm03.stderr:++ XMLSTARLET=xmlstarlet
2026-03-08T22:47:43.245 INFO:tasks.workunit.client.0.vm03.stderr:+++ uname
2026-03-08T22:47:43.245 INFO:tasks.workunit.client.0.vm03.stderr:++ '[' Linux = FreeBSD ']'
2026-03-08T22:47:43.245 INFO:tasks.workunit.client.0.vm03.stderr:++ SED=sed
2026-03-08T22:47:43.245 INFO:tasks.workunit.client.0.vm03.stderr:++ AWK=awk
2026-03-08T22:47:43.245 INFO:tasks.workunit.client.0.vm03.stderr:+++ stty -a
2026-03-08T22:47:43.245 INFO:tasks.workunit.client.0.vm03.stderr:+++ sed -e 's/.*columns \([0-9]*\).*/\1/'
2026-03-08T22:47:43.245 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device
2026-03-08T22:47:43.246 INFO:tasks.workunit.client.0.vm03.stderr:+++ head -1
2026-03-08T22:47:43.248 INFO:tasks.workunit.client.0.vm03.stderr:++ termwidth=
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:++ '[' -n '' -a '' '!=' 0 ']'
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:++ DIFFCOLOPTS='-y '
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:++ KERNCORE=kernel.core_pattern
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:++ EXTRA_OPTS=
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:++ test '' = TESTS
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:+ source /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:++ uname
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:+ '[' Linux = FreeBSD ']'
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:+ use_ec_overwrite=true
2026-03-08T22:47:43.249 INFO:tasks.workunit.client.0.vm03.stderr:+ getjson=no
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr:+ jqfilter='def walk(f):
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr: . as $in
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr: else f
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr: end;
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end)
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end)
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end)
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end)
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)'
2026-03-08T22:47:43.250 INFO:tasks.workunit.client.0.vm03.stderr:+ sortkeys='import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))'
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:+ main osd-scrub-repair
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:+ local dir=td/osd-scrub-repair
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:+ shift
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:+ shopt -s -o xtrace
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: '
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null
2026-03-08T22:47:43.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-scrub-repair
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:52: run: local dir=td/osd-scrub-repair
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:53: run: shift
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:55: run: export CEPH_MON=127.0.0.1:7107
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:55: run: CEPH_MON=127.0.0.1:7107
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:56: run: export CEPH_ARGS
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:57: run: uuidgen
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:57: run: CEPH_ARGS+='--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none '
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:58: run: CEPH_ARGS+='--mon-host=127.0.0.1:7107 '
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:59: run: CEPH_ARGS+='--osd-skip-data-digest=false '
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:61: run: export -n CEPH_CLI_TEST_DUP_COMMAND
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:62: run: set
2026-03-08T22:47:43.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:62: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p'
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:62: run: local 'funcs=TEST_allow_repair_during_recovery
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_auto_repair_bluestore_basic
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_auto_repair_bluestore_failed
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_auto_repair_bluestore_failed_norecov
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_auto_repair_bluestore_scrub
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_auto_repair_bluestore_tag
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_auto_repair_erasure_coded_appends
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_auto_repair_erasure_coded_overwrites
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_and_repair_jerasure_appends
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_and_repair_jerasure_overwrites
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_and_repair_lrc_appends
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_and_repair_lrc_overwrites
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_and_repair_replicated
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_scrub_erasure_appends
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_scrub_erasure_overwrites
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_scrub_replicated
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_corrupt_snapset_scrub_rep
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_list_missing_erasure_coded_appends
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_list_missing_erasure_coded_overwrites
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_periodic_scrub_replicated
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_repair_stats
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_repair_stats_ec
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_request_scrub_priority
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_scrub_warning
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_skip_non_repair_during_recovery
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_unfound_erasure_coded_appends
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:TEST_unfound_erasure_coded_overwrites'
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:47:43.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T22:47:43.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:47:43.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:47:43.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:47:43.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:47:43.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:47:43.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:47:43.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:47:43.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:47:43.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:47:43.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:47:43.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:47:43.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:47:43.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:47:43.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:47:43.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:47:43.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:47:43.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:47:43.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:47:43.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T22:47:43.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:47:43.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:43.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:47:43.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T22:47:43.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:47:43.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:47:43.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair
2026-03-08T22:47:43.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:47:43.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:43.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:47:43.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024
2026-03-08T22:47:43.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']'
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_allow_repair_during_recovery td/osd-scrub-repair
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:114: TEST_allow_repair_during_recovery: local dir=td/osd-scrub-repair
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:115: TEST_allow_repair_during_recovery: local poolname=rbd
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:117: TEST_allow_repair_during_recovery: run_mon td/osd-scrub-repair a --osd_pool_default_size=2
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a
2026-03-08T22:47:43.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair --osd_pool_default_size=2
2026-03-08T22:47:43.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:47:43.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:47:43.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:47:43.291 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:47:43.291 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:43.291 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:47:43.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:47:43.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2
2026-03-08T22:47:43.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:47:43.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:47:43.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:47:43.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:47:43.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:47:43.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:47:43.321 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:47:43.321 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:47:43.321 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:47:43.322 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:47:43.322 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:43.322 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:47:43.322 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T22:47:43.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:47:43.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T22:47:43.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:47:43.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:47:43.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:47:43.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T22:47:43.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:118: TEST_allow_repair_during_recovery: run_mgr td/osd-scrub-repair x
2026-03-08T22:47:43.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T22:47:43.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:47:43.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:47:43.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:47:43.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T22:47:43.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:47:43.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:47:43.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:47:43.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:47:43.549 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:47:43.549 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:47:43.549 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:47:43.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:47:43.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:47:43.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:119: TEST_allow_repair_during_recovery: run_osd td/osd-scrub-repair 0 --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:47:43.567 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:43.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:43.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:43.568 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:43.568 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:43.568 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:47:43.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:47:43.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:47:43.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:43.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:47:43.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:43.575 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true' 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:47:43.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:43.576 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 9a822072-bcb5-4cf4-b2d5-d100caf8c721 2026-03-08T22:47:43.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=9a822072-bcb5-4cf4-b2d5-d100caf8c721 2026-03-08T22:47:43.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 9a822072-bcb5-4cf4-b2d5-d100caf8c721' 2026-03-08T22:47:43.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:43.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: 
OSD_SECRET=AQCP/K1pIfWZIhAAzE5MGeChzKYkKt/J3sOoSA== 2026-03-08T22:47:43.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCP/K1pIfWZIhAAzE5MGeChzKYkKt/J3sOoSA=="}' 2026-03-08T22:47:43.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 9a822072-bcb5-4cf4-b2d5-d100caf8c721 -i td/osd-scrub-repair/0/new.json 2026-03-08T22:47:43.684 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:47:43.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T22:47:43.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true --mkfs --key AQCP/K1pIfWZIhAAzE5MGeChzKYkKt/J3sOoSA== --osd-uuid 9a822072-bcb5-4cf4-b2d5-d100caf8c721 2026-03-08T22:47:43.714 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:43.700+0000 7ffbabc0b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:43.719 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:43.708+0000 7ffbabc0b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:43.722 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:43.708+0000 7ffbabc0b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:43.722 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:43.708+0000 7ffbabc0b8c0 -1 bdev(0x55f79590ac00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:43.722 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:43.708+0000 7ffbabc0b8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T22:47:45.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T22:47:45.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:47:45.984 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T22:47:45.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:47:45.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:47:46.110 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:47:46.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:47:46.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true 2026-03-08T22:47:46.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:47:46.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:47:46.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:47:46.125 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:46.112+0000 7f42d1cb78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:46.127 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:46.116+0000 7f42d1cb78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:46.129 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:46.116+0000 7f42d1cb78c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:46.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:47:46.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:47.331 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:47.320+0000 7f42d1cb78c0 -1 Falling back to public interface 2026-03-08T22:47:47.439 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:47:47.439 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:47.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:47.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:47.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:47.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:47:47.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:48.298 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:48.284+0000 7f42d1cb78c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:47:48.595 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:47:48.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:48.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:48.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:48.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:48.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:47:48.759 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:49.761 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:47:49.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:49.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:49.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:47:49.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:49.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1765775890,v1:127.0.0.1:6803/1765775890] [v2:127.0.0.1:6804/1765775890,v1:127.0.0.1:6805/1765775890] exists,up 9a822072-bcb5-4cf4-b2d5-d100caf8c721 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:122: TEST_allow_repair_during_recovery: run_osd 
td/osd-scrub-repair 1 --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:47:49.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 
2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:47:49.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:47:49.923 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:47:49.923 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true' 2026-03-08T22:47:49.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:47:49.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:47:49.925 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 dd71471d-3534-4e39-8c94-02c41b79ae2d 2026-03-08T22:47:49.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=dd71471d-3534-4e39-8c94-02c41b79ae2d 2026-03-08T22:47:49.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 dd71471d-3534-4e39-8c94-02c41b79ae2d' 2026-03-08T22:47:49.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:47:49.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCV/K1p2MdrNxAAUcNdMujFlA03exExVAfl6w== 2026-03-08T22:47:49.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCV/K1p2MdrNxAAUcNdMujFlA03exExVAfl6w=="}' 2026-03-08T22:47:49.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 
dd71471d-3534-4e39-8c94-02c41b79ae2d -i td/osd-scrub-repair/1/new.json 2026-03-08T22:47:50.090 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:47:50.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T22:47:50.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true --mkfs --key AQCV/K1p2MdrNxAAUcNdMujFlA03exExVAfl6w== --osd-uuid dd71471d-3534-4e39-8c94-02c41b79ae2d 2026-03-08T22:47:50.120 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:50.108+0000 7f15b08468c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:50.122 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:50.108+0000 7f15b08468c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:50.123 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:50.112+0000 7f15b08468c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:50.123 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:50.112+0000 7f15b08468c0 -1 bdev(0x55c31dac9c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:47:50.123 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:50.112+0000 7f15b08468c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T22:47:52.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T22:47:52.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:47:52.375 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T22:47:52.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:47:52.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:47:52.570 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T22:47:52.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:47:52.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true 2026-03-08T22:47:52.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:47:52.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:47:52.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:47:52.588 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:52.576+0000 7fca0674b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:52.589 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:52.576+0000 7fca0674b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:47:52.591 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:52.576+0000 7fca0674b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:52.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:47:52.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:53.539 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:53.528+0000 7fca0674b8c0 -1 Falling back to public interface 2026-03-08T22:47:53.919 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:47:53.919 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:53.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:53.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:47:53.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:53.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:47:54.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:54.514 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:47:54.500+0000 7fca0674b8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:47:55.083 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:47:55.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:55.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:55.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:47:55.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:47:55.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:55.267 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:47:56.268 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:47:56.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:47:56.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:47:56.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:47:56.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:47:56.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:47:56.423 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/4089510632,v1:127.0.0.1:6811/4089510632] [v2:127.0.0.1:6812/4089510632,v1:127.0.0.1:6813/4089510632] exists,up dd71471d-3534-4e39-8c94-02c41b79ae2d 2026-03-08T22:47:56.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:47:56.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:47:56.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:47:56.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:125: TEST_allow_repair_during_recovery: 
create_rbd_pool 2026-03-08T22:47:56.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T22:47:56.574 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T22:47:56.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T22:47:56.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:47:56.770 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T22:47:56.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:47:57.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T22:47:58.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:126: TEST_allow_repair_during_recovery: wait_for_clean 2026-03-08T22:47:58.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:47:58.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:47:58.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:47:58.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 
2026-03-08T22:47:58.079 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:47:58.079 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:47:58.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:47:58.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:47:58.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:47:58.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:47:58.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:47:58.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:47:58.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:47:58.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:47:58.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:47:58.292 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:47:58.292 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T22:47:58.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:47:58.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:47:58.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:47:58.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483 2026-03-08T22:47:58.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483 2026-03-08T22:47:58.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483' 2026-03-08T22:47:58.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:47:58.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:47:58.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672962 2026-03-08T22:47:58.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672962 2026-03-08T22:47:58.436 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-42949672962' 2026-03-08T22:47:58.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:47:58.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483 2026-03-08T22:47:58.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:47:58.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:47:58.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483 2026-03-08T22:47:58.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:47:58.439 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483 2026-03-08T22:47:58.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483 2026-03-08T22:47:58.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483' 2026-03-08T22:47:58.439 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:47:58.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 
21474836483 2026-03-08T22:47:58.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:47:59.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:47:59.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:47:59.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T22:47:59.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:48:00.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:48:00.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:00.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836483 2026-03-08T22:48:00.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:00.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672962 2026-03-08T22:48:00.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:00.909 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:48:00.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672962 2026-03-08T22:48:00.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:00.910 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672962 2026-03-08T22:48:00.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672962 2026-03-08T22:48:00.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672962' 2026-03-08T22:48:00.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:48:01.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672962 -lt 42949672962 2026-03-08T22:48:01.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:48:01.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:01.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:01.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 
2026-03-08T22:48:01.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:01.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:01.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:01.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:01.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:48:01.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:48:01.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:01.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T22:48:01.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:01.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:01.403 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:128: TEST_allow_repair_during_recovery: add_something td/osd-scrub-repair rbd 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=rbd 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T22:48:01.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T22:48:01.784 
INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T22:48:01.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T22:48:01.993 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T22:48:02.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T22:48:02.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T22:48:02.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool rbd put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T22:48:02.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:129: TEST_allow_repair_during_recovery: get_not_primary rbd SOMETHING 2026-03-08T22:48:02.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=rbd 2026-03-08T22:48:02.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:48:02.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary rbd SOMETHING 2026-03-08T22:48:02.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=rbd 2026-03-08T22:48:02.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local 
objectname=SOMETHING 2026-03-08T22:48:02.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map rbd SOMETHING 2026-03-08T22:48:02.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:48:02.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:48:02.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map rbd SOMETHING 2026-03-08T22:48:02.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:129: TEST_allow_repair_during_recovery: corrupt_and_repair_one td/osd-scrub-repair rbd 0 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=rbd 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=0 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 
2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 
2026-03-08T22:48:02.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:48:02.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:48:02.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:48:02.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:48:02.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:48:02.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:48:02.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:48:02.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:48:02.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:48:02.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:48:02.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:48:02.480 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING remove 2026-03-08T22:48:03.122 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:eb822e21:::SOMETHING:head# 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: 
ceph_args+=' --osd-journal-size=100' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:48:03.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n 
'' ']' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' 
--osd-mclock-profile=high_recovery_ops' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:48:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:03.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:48:03.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:48:03.656 
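A detail worth noting in the `ceph-osd` invocation above: the `--admin-socket`, `--log-file` and `--pid-file` options are single-quoted so that `$cluster` and `$name` survive the shell and reach the daemon literally, where Ceph substitutes its own per-daemon values. A stand-alone illustration of why the quoting matters (`show_args` is a hypothetical stand-in for `ceph-osd`, not part of the helper):

```shell
# show_args stands in for ceph-osd: it just prints the arguments it receives.
show_args() { printf '%s\n' "$@"; }

name="shell-expanded"

# Single quotes: the daemon receives the literal metavariable and substitutes
# its own name (e.g. osd.0) at runtime.
show_args '--log-file=td/$name.log'     # prints --log-file=td/$name.log

# Double quotes: the shell expands $name first, so every daemon would log to
# a file named after the calling shell's variable instead.
show_args "--log-file=td/$name.log"     # prints --log-file=td/shell-expanded.log
```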
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:48:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:48:03.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:48:03.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:03.656+0000 7f76a2ac48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:03.679 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:03.668+0000 7f76a2ac48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:03.680 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:03.668+0000 7f76a2ac48c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:03.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:03.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:04.379 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:04.368+0000 7f76a2ac48c0 -1 Falling back to public interface 2026-03-08T22:48:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:48:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:48:04.991 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:48:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:05.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:05.362 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:05.348+0000 7f76a2ac48c0 -1 osd.0 19 log_to_monitors true 2026-03-08T22:48:06.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:06.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:06.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:48:06.167 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:48:06.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:06.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:06.343 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:07.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:07.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:07.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:48:07.344 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:48:07.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:07.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:07.500 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 24 up_thru 24 down_at 20 last_clean_interval [5,19) [v2:127.0.0.1:6802/1208773963,v1:127.0.0.1:6803/1208773963] [v2:127.0.0.1:6804/1208773963,v1:127.0.0.1:6805/1208773963] exists,up 9a822072-bcb5-4cf4-b2d5-d100caf8c721 2026-03-08T22:48:07.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:48:07.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:48:07.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:48:07.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
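The `wait_for_osd` trace above is a plain poll-with-timeout loop: up to 300 one-second attempts at grepping `osd.0 up` out of `ceph osd dump`, succeeding here on the third poll. A reduced sketch of the same pattern; the successive dump outputs are passed in as arguments to stand in for the ceph CLI, which is an illustrative simplification:

```shell
# $1 = osd id; remaining args simulate successive `ceph osd dump` outputs.
# Returns the number of polls needed, or fails if the OSD never comes up.
wait_for_osd_up() {
    local id=$1; shift
    local i=0 out
    for out in "$@"; do
        i=$((i + 1))
        echo "$out" | grep -q "osd\.$id up" && { echo "$i"; return 0; }
        # the real helper sleeps 1s between polls and gives up after 300
    done
    return 1
}

polls=$(wait_for_osd_up 0 "osd.0 down" "osd.0 down" "osd.0 up in weight 1")
echo "osd.0 up after $polls polls"   # 3 polls, as in the trace
```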
2026-03-08T22:48:07.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:48:07.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:48:07.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:48:07.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:48:07.501 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:48:07.501 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:48:07.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:48:07.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:48:07.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:48:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:48:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
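`get_timeout_delays 90 .1` turns a total timeout (90s) and an initial delay (0.1s) into the doubling series shown in the `delays` array. A hypothetical re-implementation of that arithmetic; the 15-second per-step cap and the trimmed final step are inferred from the array in the trace, not read from the helper's source:

```shell
# Emit a doubling backoff series: start at $2 seconds, double each step,
# cap each step at 15s (inferred), and trim the last step so the series
# sums to exactly $1 seconds.
backoff_delays() {
    awk -v timeout="$1" -v first="$2" 'BEGIN {
        d = first; sum = 0; n = 0
        while (sum < timeout - 1e-9) {
            if (d > 15) d = 15                        # cap a single wait
            if (sum + d > timeout) d = timeout - sum  # trim the final wait
            if (n++) printf " "
            printf "%g", d
            sum += d
            d *= 2
        }
        print ""
    }'
}

backoff_delays 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5  (sums to 90)
```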
2026-03-08T22:48:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:48:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:48:07.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:48:07.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:48:07.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:48:07.712 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T22:48:07.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:48:07.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:07.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:48:07.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215106 2026-03-08T22:48:07.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215106 2026-03-08T22:48:07.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106' 2026-03-08T22:48:07.786 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:07.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:48:07.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672965 2026-03-08T22:48:07.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672965 2026-03-08T22:48:07.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672965' 2026-03-08T22:48:07.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:07.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215106 2026-03-08T22:48:07.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:07.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:48:07.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215106 2026-03-08T22:48:07.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:07.862 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215106 2026-03-08T22:48:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215106' 2026-03-08T22:48:07.862 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 103079215106 2026-03-08T22:48:07.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:08.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 103079215106 2026-03-08T22:48:08.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:48:09.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:48:09.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:09.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215106 -lt 103079215106 2026-03-08T22:48:09.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:09.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672965 2026-03-08T22:48:09.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut 
-d - -f 1 2026-03-08T22:48:09.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:48:09.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672965 2026-03-08T22:48:09.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:09.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672965 2026-03-08T22:48:09.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672965' 2026-03-08T22:48:09.178 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672965 2026-03-08T22:48:09.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:48:09.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672965 2026-03-08T22:48:09.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:48:09.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:09.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:09.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
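The `flush_pg_stats` trace above records one `osd-seq` pair per OSD into `seqs`, then splits each pair back apart with `cut` and polls `ceph osd last-stat-seq` until the reported value catches up. The split step in isolation, using the seq numbers from this trace:

```shell
# One "<osd>-<seq>" pair per OSD, as accumulated by flush_pg_stats.
seqs=' 0-103079215106 1-42949672965'

for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # OSD id before the dash
    seq=$(echo "$s" | cut -d - -f 2)   # stat sequence number after it
    echo "waiting osd.$osd seq $seq"
    # the real helper now loops on `ceph osd last-stat-seq $osd`
    # until it is no longer less than $seq
done
```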
wait_for_clean: test 4 == 0 2026-03-08T22:48:09.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:09.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:09.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:09.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:09.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:48:09.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:48:09.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:09.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T22:48:09.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:09.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:09.685 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:09.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T22:48:09.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:48:09.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:48:09.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg rbd SOMETHING 2026-03-08T22:48:09.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=rbd 2026-03-08T22:48:09.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:48:09.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map rbd SOMETHING 2026-03-08T22:48:09.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:48:10.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=1.3 2026-03-08T22:48:10.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 1.3 2026-03-08T22:48:10.037 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.3 2026-03-08T22:48:10.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 1.3 2026-03-08T22:48:10.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T22:48:10.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:48:10.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:48:10.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T22:48:10.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:47:56.760551+0000 2026-03-08T22:48:10.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.3 2026-03-08T22:48:10.346 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.3 on osd.1 to repair 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.3 2026-03-08T22:47:56.760551+0000 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.3 2026-03-08T22:48:10.358 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:47:56.760551+0000 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:48:10.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T22:48:10.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:47:56.760551+0000 '>' 2026-03-08T22:47:56.760551+0000 2026-03-08T22:48:10.515 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:48:11.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:48:11.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:48:11.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp
2026-03-08T22:48:11.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3
2026-03-08T22:48:11.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:48:11.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:48:11.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp'
2026-03-08T22:48:11.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:47:56.760551+0000 '>' 2026-03-08T22:47:56.760551+0000
2026-03-08T22:48:11.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:48:12.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:48:12.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:48:12.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp
2026-03-08T22:48:12.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3
2026-03-08T22:48:12.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:48:12.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:48:12.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp'
2026-03-08T22:48:12.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:48:10.478288+0000 '>' 2026-03-08T22:47:56.760551+0000
2026-03-08T22:48:12.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T22:48:12.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:48:12.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T22:48:12.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:48:12.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:48:12.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:48:12.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:48:12.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T22:48:12.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:48:12.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T22:48:12.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:48:12.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:48:12.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1
SOMETHING list-attrs
2026-03-08T22:48:13.292 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.3_head,#-3:c0000000:::scrub_1.3:head#, (61) No data available
2026-03-08T22:48:13.292 INFO:tasks.workunit.client.0.vm03.stdout:_
2026-03-08T22:48:13.292 INFO:tasks.workunit.client.0.vm03.stdout:snapset
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:48:13.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:48:13.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T22:48:13.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T22:48:13.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:48:13.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:48:13.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:48:13.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:13.580+0000 7f650d5b58c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:13.593 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:13.580+0000 7f650d5b58c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:13.594 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:13.580+0000 7f650d5b58c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:48:13.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:48:14.539 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:14.528+0000 7f650d5b58c0 -1 Falling back to public interface
2026-03-08T22:48:14.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:48:14.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:14.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:48:14.913 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:48:14.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:14.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:48:15.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:48:15.537 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:15.524+0000 7f650d5b58c0 -1 osd.1 26 log_to_monitors true
2026-03-08T22:48:16.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:48:16.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:16.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:48:16.075 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:48:16.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:16.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:48:16.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:48:16.433 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:16.420+0000 7f6504565640 -1 osd.1 26 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T22:48:17.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:48:17.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:48:17.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:48:17.250 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T22:48:17.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:48:17.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 30 up_thru 30 down_at 27 last_clean_interval [10,26) [v2:127.0.0.1:6810/210880356,v1:127.0.0.1:6811/210880356] [v2:127.0.0.1:6812/210880356,v1:127.0.0.1:6813/210880356] exists,up dd71471d-3534-4e39-8c94-02c41b79ae2d
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:48:17.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:48:17.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15'
'4.5')
2026-03-08T22:48:17.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:48:17.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:48:17.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:48:17.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:48:17.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:48:17.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:48:17.618 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T22:48:17.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:48:17.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:48:17.618 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:48:17.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215109
2026-03-08T22:48:17.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215109
2026-03-08T22:48:17.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215109'
2026-03-08T22:48:17.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:48:17.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:48:17.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018882
2026-03-08T22:48:17.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018882
2026-03-08T22:48:17.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215109 1-128849018882'
2026-03-08T22:48:17.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:48:17.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215109
2026-03-08T22:48:17.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:48:17.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:48:17.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215109
2026-03-08T22:48:17.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:48:17.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215109
2026-03-08T22:48:17.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215109'
2026-03-08T22:48:17.760 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 103079215109
2026-03-08T22:48:17.760 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:48:17.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215108 -lt 103079215109
2026-03-08T22:48:17.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:48:18.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:48:18.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:48:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215109 -lt 103079215109
2026-03-08T22:48:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:48:19.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-128849018882
2026-03-08T22:48:19.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:48:19.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:48:19.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-128849018882
2026-03-08T22:48:19.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:48:19.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018882
2026-03-08T22:48:19.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 128849018882'
2026-03-08T22:48:19.077 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 128849018882
2026-03-08T22:48:19.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:48:19.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018882 -lt 128849018882
2026-03-08T22:48:19.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:48:19.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:48:19.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0
2026-03-08T22:48:19.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:48:19.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:48:19.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:48:19.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:48:19.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:48:19.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:48:19.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:48:19.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4
2026-03-08T22:48:19.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:48:19.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:48:19.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:48:19.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4
2026-03-08T22:48:19.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:48:19.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:48:19.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool rbd get SOMETHING td/osd-scrub-repair/COPY
2026-03-08T22:48:19.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:48:19.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:48:19.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:48:19.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:48:19.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:48:19.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:48:19.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:48:19.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:48:19.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:48:19.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:48:19.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:48:19.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:48:19.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:48:19.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:48:19.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:48:19.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:48:19.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:48:19.913 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:19.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:48:19.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:48:19.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:48:19.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:48:19.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:48:19.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:48:19.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:48:19.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:48:19.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:48:19.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:48:19.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:48:19.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:48:19.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:48:19.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:48:19.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:48:19.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:48:19.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:48:19.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:48:19.920 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:19.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:48:19.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:48:19.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:48:19.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:48:19.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T22:48:19.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:48:19.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:19.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:48:19.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_auto_repair_bluestore_basic td/osd-scrub-repair 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:438: TEST_auto_repair_bluestore_basic: local dir=td/osd-scrub-repair 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:439: TEST_auto_repair_bluestore_basic: cluster_conf=(['osds_num']='3' ['pgs_in_pool']='1' ['pool_name']='testpool' ['extras']=' --osd_scrub_auto_repair=true') 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:439: TEST_auto_repair_bluestore_basic: local -A cluster_conf 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:445: TEST_auto_repair_bluestore_basic: local extr_dbg=3 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:446: TEST_auto_repair_bluestore_basic: standard_scrub_cluster td/osd-scrub-repair cluster_conf 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:229: standard_scrub_cluster: local dir=td/osd-scrub-repair 2026-03-08T22:48:19.923 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:230: standard_scrub_cluster: local -n args=cluster_conf 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:232: standard_scrub_cluster: local OSDS=3 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:233: standard_scrub_cluster: local pg_num=1 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:234: standard_scrub_cluster: local poolname=testpool 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:235: standard_scrub_cluster: args['pool_name']=testpool 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:236: standard_scrub_cluster: local 'extra_pars= --osd_scrub_auto_repair=true' 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:237: standard_scrub_cluster: local debug_msg=dbg 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:240: standard_scrub_cluster: local saved_echo_flag=x 2026-03-08T22:48:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:241: standard_scrub_cluster: set +x 2026-03-08T22:48:20.242 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 a4e5fa69-ecca-4ab4-9ebf-6b5b5313f68a 2026-03-08T22:48:20.357 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:48:20.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:20.372+0000 7f3588eb88c0 -1 WARNING: all 
dangerous and experimental features are enabled. 2026-03-08T22:48:20.395 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:20.384+0000 7f3588eb88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:20.396 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:20.384+0000 7f3588eb88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:20.396 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:20.384+0000 7f3588eb88c0 -1 bdev(0x55d85c139c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:48:20.396 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:20.384+0000 7f3588eb88c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T22:48:22.666 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T22:48:22.792 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:48:22.810 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:22.796+0000 7f91d13e88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:22.810 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:22.796+0000 7f91d13e88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:22.811 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:22.800+0000 7f91d13e88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:22.958 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:48:23.995 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:23.984+0000 7f91d13e88c0 -1 Falling back to public interface 2026-03-08T22:48:24.119 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:48:24.960 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:24.948+0000 7f91d13e88c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:48:25.289 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:48:26.465 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:48:26.631 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3198357618,v1:127.0.0.1:6803/3198357618] [v2:127.0.0.1:6804/3198357618,v1:127.0.0.1:6805/3198357618] exists,up a4e5fa69-ecca-4ab4-9ebf-6b5b5313f68a 2026-03-08T22:48:26.635 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 ff5283d9-e05c-4831-b89d-84c6690eb274 2026-03-08T22:48:26.827 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:48:26.859 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:26.848+0000 7f8853f088c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:26.860 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:26.848+0000 7f8853f088c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:26.861 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:26.848+0000 7f8853f088c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:26.861 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:26.848+0000 7f8853f088c0 -1 bdev(0x557dd50b5c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:48:26.861 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:26.848+0000 7f8853f088c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T22:48:30.348 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T22:48:30.538 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T22:48:30.555 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:30.544+0000 7f795203b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:30.563 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:30.552+0000 7f795203b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:30.564 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:30.552+0000 7f795203b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:30.705 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:48:31.527 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:31.516+0000 7f795203b8c0 -1 Falling back to public interface 2026-03-08T22:48:31.861 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:48:32.514 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:32.500+0000 7f795203b8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:48:33.016 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:48:34.203 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:48:34.370 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/594936957,v1:127.0.0.1:6811/594936957] [v2:127.0.0.1:6812/594936957,v1:127.0.0.1:6813/594936957] exists,up ff5283d9-e05c-4831-b89d-84c6690eb274 2026-03-08T22:48:34.374 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 0e35ee69-fe1f-4b1e-af2c-7b48b3fe2008 2026-03-08T22:48:34.537 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:48:34.568 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:34.556+0000 7faa48ac68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:34.570 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:34.556+0000 7faa48ac68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:34.571 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:34.560+0000 7faa48ac68c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:34.571 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:34.560+0000 7faa48ac68c0 -1 bdev(0x559b3b067c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:48:34.571 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:34.560+0000 7faa48ac68c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T22:48:36.840 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T22:48:37.026 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T22:48:37.041 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:37.028+0000 7f9b454328c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:37.048 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:37.036+0000 7f9b454328c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:37.049 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:37.036+0000 7f9b454328c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:37.191 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:48:38.247 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:38.236+0000 7f9b454328c0 -1 Falling back to public interface 2026-03-08T22:48:38.347 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:48:39.236 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:39.224+0000 7f9b454328c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:48:39.505 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:48:40.700 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:48:40.883 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/548443830,v1:127.0.0.1:6819/548443830] [v2:127.0.0.1:6820/548443830,v1:127.0.0.1:6821/548443830] exists,up 0e35ee69-fe1f-4b1e-af2c-7b48b3fe2008 2026-03-08T22:48:41.094 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool' created 2026-03-08T22:48:42.543 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T22:48:43.861 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963 2026-03-08T22:48:44.024 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542146 2026-03-08T22:48:45.896 INFO:tasks.workunit.client.0.vm03.stdout:standard_scrub_cluster: dbg: test pool is testpool 1 2026-03-08T22:48:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:447: TEST_auto_repair_bluestore_basic: local poolid=1 2026-03-08T22:48:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:448: TEST_auto_repair_bluestore_basic: local poolname=testpool 2026-03-08T22:48:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:450: TEST_auto_repair_bluestore_basic: ceph osd pool set testpool size 2 2026-03-08T22:48:46.104 
INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 size to 2 2026-03-08T22:48:46.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:451: TEST_auto_repair_bluestore_basic: wait_for_clean 2026-03-08T22:48:46.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:48:46.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:48:46.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:48:46.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:48:46.122 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:48:46.122 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:48:46.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:48:46.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:48:46.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:48:46.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' 
'0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:48:46.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:48:46.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:48:46.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:48:46.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:48:46.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:48:46.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:48:46.361 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:48:46.361 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:48:46.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:48:46.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:46.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:48:46.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 2026-03-08T22:48:46.440 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T22:48:46.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T22:48:46.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:46.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:48:46.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672965 2026-03-08T22:48:46.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672965 2026-03-08T22:48:46.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965' 2026-03-08T22:48:46.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:46.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:48:46.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542148 2026-03-08T22:48:46.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542148 2026-03-08T22:48:46.590 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-60129542148' 2026-03-08T22:48:46.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:46.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T22:48:46.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:46.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:48:46.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T22:48:46.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:46.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486 2026-03-08T22:48:46.593 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836486 2026-03-08T22:48:46.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T22:48:46.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:46.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
21474836486 -lt 21474836486 2026-03-08T22:48:46.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:46.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672965 2026-03-08T22:48:46.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:46.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:48:46.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672965 2026-03-08T22:48:46.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:46.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672965 2026-03-08T22:48:46.759 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672965 2026-03-08T22:48:46.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672965' 2026-03-08T22:48:46.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:48:46.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672965 2026-03-08T22:48:46.915 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:46.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542148 2026-03-08T22:48:46.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:46.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:48:46.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542148 2026-03-08T22:48:46.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:46.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542148 2026-03-08T22:48:46.917 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542148 2026-03-08T22:48:46.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542148' 2026-03-08T22:48:46.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:48:47.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542148 -lt 60129542148 2026-03-08T22:48:47.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 
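The trace above shows `flush_pg_stats` telling each OSD to flush, recording an `osd-seq` pair per OSD, then splitting the pairs back apart with `cut` before polling `ceph osd last-stat-seq`. A minimal stand-alone sketch of that pairing pattern (the `seq` values and the comment-marked commands are stand-ins for the real `ceph tell`/`last-stat-seq` calls, not the actual helper):

```shell
# Sketch of the flush_pg_stats pairing pattern seen in ceph-helpers.sh:
# collect one "osd-seq" token per OSD, then parse each token with cut.
seqs=''
for osd in 0 1 2; do
    seq=$(( (osd + 1) * 1000 ))   # stand-in for: ceph tell osd.$osd flush_pg_stats
    seqs="$seqs $osd-$seq"
done
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)
    seq=$(echo "$s" | cut -d - -f 2)
    echo "waiting osd.$osd seq $seq"
    # real helper: loop until `ceph osd last-stat-seq $osd` reaches $seq
done
```

The single space-separated string keeps the helper POSIX-sh friendly; the real sequence numbers in the log (e.g. 21474836486) come from the OSDs themselves.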
2026-03-08T22:48:47.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:47.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:47.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:48:47.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:47.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:47.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:47.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:47.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:48:47.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:48:47.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:47.421 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:48:47.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:47.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:47.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:47.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1 2026-03-08T22:48:47.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' -1 2026-03-08T22:48:47.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:48:47.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=0 2026-03-08T22:48:47.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:48:47.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:48:47.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:48:47.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:47.717 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:47.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:47.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:47.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:48:47.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:48:47.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:47.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:48:47.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:47.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:47.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:48.071 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1 2026-03-08T22:48:48.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:48:48.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:48:48.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:48:48.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:48:48.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:48:48.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:48:48.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:48:48.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:48:48.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local 
progress=null 2026-03-08T22:48:48.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:48:48.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 1 >= 13 )) 2026-03-08T22:48:48.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:48:48.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.2 2026-03-08T22:48:48.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:48:48.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:48.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:48.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:48.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:48.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:48:48.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
2026-03-08T22:48:48.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:48.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:48:48.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:48.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:48.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:48.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1 2026-03-08T22:48:48.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:48:48.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:48:48.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:48:48.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:48:48.824 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:48:48.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:48:48.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:48:48.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:48:49.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:48:49.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:48:49.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 2 >= 13 )) 2026-03-08T22:48:49.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:48:49.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.4 2026-03-08T22:48:49.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:48:49.426 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:49.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:49.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:49.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:49.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:48:49.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:48:49.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:49.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:48:49.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:49.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:49.583 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:48:49.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 
2026-03-08T22:48:49.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:48:49.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:48:49.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 3 >= 13 )) 2026-03-08T22:48:49.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:48:49.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.8 2026-03-08T22:48:50.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:48:50.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:48:50.762 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:48:50.762 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:48:50.762 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:48:50.762 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 
2026-03-08T22:48:50.762 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:48:50.762 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:48:50.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:48:50.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:48:50.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:50.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:48:51.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:48:51.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:48:51.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:48:51.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:454: TEST_auto_repair_bluestore_basic: local payload=ABCDEF 2026-03-08T22:48:51.109 
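The `wait_for_clean` trace above polls with an exponential backoff: `sleep 0.1`, then `0.2`, `0.4`, `0.8`, giving up once the loop counter reaches a cap (`(( loop >= 13 ))` in the trace). A hedged sketch of that retry shape, with a stand-in `check` condition in place of the real active+clean PG count:

```shell
# Sketch of the wait_for_clean retry pattern: poll a condition,
# doubling the sleep after each failed attempt, with a loop cap.
delay=0.1
loop=0
max_loops=13
check() { [ "$loop" -ge 3 ]; }   # stand-in for: all PGs active+clean
while ! check; do
    if [ "$loop" -ge "$max_loops" ]; then
        echo "timed out waiting for clean" >&2
        exit 1
    fi
    sleep "$delay"
    delay=$(awk -v d="$delay" 'BEGIN { print d * 2 }')  # 0.1 -> 0.2 -> 0.4 ...
    loop=$((loop + 1))
done
echo "condition met after $loop retries"
```

Fractional `sleep` is a GNU/busybox extension rather than strict POSIX, which matches the Ubuntu environment the job runs on.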
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:455: TEST_auto_repair_bluestore_basic: echo ABCDEF 2026-03-08T22:48:51.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:456: TEST_auto_repair_bluestore_basic: rados --pool testpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T22:48:51.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:460: TEST_auto_repair_bluestore_basic: get_not_primary testpool SOMETHING 2026-03-08T22:48:51.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool 2026-03-08T22:48:51.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:48:51.134 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:48:51.134 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:48:51.135 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:48:51.135 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:48:51.135 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:48:51.304 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:48:51.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:48:51.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:460: TEST_auto_repair_bluestore_basic: objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:48:51.474 
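Here `get_not_primary` asks `ceph osd map` for the acting set, then uses jq (`.acting | map(select (. != 1)) | .[0]`) to pick the first OSD that is not the acting primary. A jq-free stand-in of the same selection, with a hypothetical hard-coded acting set in place of the real `ceph` query:

```shell
# Hypothetical stand-in for get_not_primary: given the acting set and
# the primary id, return the first non-primary OSD.
acting="1 0 2"   # stand-in for: ceph --format json osd map testpool SOMETHING | jq .acting
primary=1
not_primary=''
for osd in $acting; do
    if [ "$osd" != "$primary" ]; then
        not_primary=$osd
        break
    fi
done
echo "$not_primary"
```

In the log the acting primary is osd.1, so the helper resolves the non-primary replica that the test then damages via `ceph-objectstore-tool`.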
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:48:51.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:48:51.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:48:51.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:48:51.579 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:48:51.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:48:51.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:48:51.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:48:51.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:48:51.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING remove 2026-03-08T22:48:52.233 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:eb822e21:::SOMETHING:head# 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:48:52.769 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:48:52.769 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:48:52.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 
2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:48:52.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:48:52.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:48:52.771 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:48:52.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 
--auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:48:52.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:48:52.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:48:52.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:48:52.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:48:52.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:48:52.791 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:52.776+0000 7f73b3fce8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:52.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:52.780+0000 7f73b3fce8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:48:52.796 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:52.784+0000 7f73b3fce8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:53.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:53.995 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:53.984+0000 7f73b3fce8c0 -1 Falling back to public interface 2026-03-08T22:48:54.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:48:54.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:54.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:48:54.116 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:48:54.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:54.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:54.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:54.984 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:48:54.972+0000 7f73b3fce8c0 -1 osd.0 20 log_to_monitors true 2026-03-08T22:48:55.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:48:55.278 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:48:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:55.464 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:48:56.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:48:56.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:48:56.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:48:56.466 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:48:56.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:48:56.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:48:56.628 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 24 up_thru 0 down_at 21 last_clean_interval [5,20) [v2:127.0.0.1:6802/908200531,v1:127.0.0.1:6803/908200531] [v2:127.0.0.1:6804/908200531,v1:127.0.0.1:6805/908200531] exists,up a4e5fa69-ecca-4ab4-9ebf-6b5b5313f68a 2026-03-08T22:48:56.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:48:56.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:48:56.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:48:56.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:48:56.629 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:48:56.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:48:56.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:48:56.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:48:56.629 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:48:56.629 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:48:56.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:48:56.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:48:56.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:48:56.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:48:56.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:48:56.691 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:48:56.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:48:56.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:48:56.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:48:56.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:48:56.860 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:48:56.860 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:48:56.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:48:56.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:56.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:48:56.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215106 2026-03-08T22:48:56.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215106 2026-03-08T22:48:56.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106' 
2026-03-08T22:48:56.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:56.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:48:57.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672968 2026-03-08T22:48:57.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672968 2026-03-08T22:48:57.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672968' 2026-03-08T22:48:57.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:48:57.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:48:57.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542151 2026-03-08T22:48:57.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542151 2026-03-08T22:48:57.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672968 2-60129542151' 2026-03-08T22:48:57.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:57.103 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215106 2026-03-08T22:48:57.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:57.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:48:57.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215106 2026-03-08T22:48:57.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:57.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215106 2026-03-08T22:48:57.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215106' 2026-03-08T22:48:57.105 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 103079215106 2026-03-08T22:48:57.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:57.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 103079215106 2026-03-08T22:48:57.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:48:58.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:48:58.278 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:58.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 103079215106 2026-03-08T22:48:58.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:48:59.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:48:59.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:48:59.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215106 -lt 103079215106 2026-03-08T22:48:59.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:59.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672968 2026-03-08T22:48:59.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:59.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:48:59.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:48:59.591 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672968 2026-03-08T22:48:59.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672968 2026-03-08T22:48:59.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672968' 2026-03-08T22:48:59.592 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672968 2026-03-08T22:48:59.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:48:59.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672968 -lt 42949672968 2026-03-08T22:48:59.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:48:59.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542151 2026-03-08T22:48:59.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:48:59.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:48:59.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542151 2026-03-08T22:48:59.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:48:59.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542151 2026-03-08T22:48:59.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542151' 2026-03-08T22:48:59.754 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542151 2026-03-08T22:48:59.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:48:59.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542151 -lt 60129542151 2026-03-08T22:48:59.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:48:59.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:48:59.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:49:00.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:49:00.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:49:00.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:49:00.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: 
local expression 2026-03-08T22:49:00.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:49:00.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:49:00.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:49:00.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:49:00.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:49:00.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:49:00.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:49:00.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:49:00.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:49:00.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:49:00.491 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:49:00.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:461: TEST_auto_repair_bluestore_basic: ceph tell 'osd.*' config set osd_scrub_auto_repair true 2026-03-08T22:49:00.555 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T22:49:00.555 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_auto_repair = '' (not observed, change may require restart) osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T22:49:00.555 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:49:00.562 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T22:49:00.562 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_auto_repair = '' (not observed, change may require restart) osd_scrub_sleep 
= '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T22:49:00.562 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:49:00.568 INFO:tasks.workunit.client.0.vm03.stdout:osd.2: { 2026-03-08T22:49:00.568 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_auto_repair = '' (not observed, change may require restart) osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T22:49:00.568 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:49:00.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:463: TEST_auto_repair_bluestore_basic: get_pg testpool SOMETHING 2026-03-08T22:49:00.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=testpool 2026-03-08T22:49:00.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:49:00.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map testpool SOMETHING 2026-03-08T22:49:00.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 
2026-03-08T22:49:00.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:463: TEST_auto_repair_bluestore_basic: local pgid=1.0 2026-03-08T22:49:00.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:464: TEST_auto_repair_bluestore_basic: get_primary testpool SOMETHING 2026-03-08T22:49:00.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:49:00.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:49:00.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:49:00.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:49:00.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:464: TEST_auto_repair_bluestore_basic: local primary=1 2026-03-08T22:49:00.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:465: TEST_auto_repair_bluestore_basic: get_last_scrub_stamp 1.0 2026-03-08T22:49:00.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:49:00.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:49:00.891 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:49:00.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:49:01.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:465: TEST_auto_repair_bluestore_basic: local last_scrub_stamp=2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:01.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:468: TEST_auto_repair_bluestore_basic: ceph tell 1.0 schedule-deep-scrub 2026-03-08T22:49:01.105 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:49:01.105 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T22:49:01.105 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T22:49:01.105 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-22T22:47:21.099098+0000" 2026-03-08T22:49:01.105 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:471: TEST_auto_repair_bluestore_basic: wait_for_scrub 1.0 2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: 
wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:49:01.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:49:01.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:48:41.086215+0000 '>' 2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:01.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:49:02.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:49:02.262 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:49:02.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:49:02.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:49:02.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:49:02.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:49:02.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:49:02.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:48:41.086215+0000 '>' 2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:02.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:49:03.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:49:03.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:49:03.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:49:03.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:49:03.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:49:03.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:49:03.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:49:03.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:48:41.086215+0000 '>' 2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:03.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:49:04.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:49:04.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:49:04.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:49:04.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:49:04.584 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:49:04.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:49:04.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:49:04.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:48:41.086215+0000 '>' 2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:04.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:49:05.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:49:05.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:49:05.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:49:05.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:49:05.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:49:05.741 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:49:05.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:49:05.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:48:41.086215+0000 '>' 2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:05.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:49:06.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:49:06.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:49:06.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:49:06.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:49:06.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:49:06.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:49:06.899 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:49:01.641029+0000 '>' 2026-03-08T22:48:41.086215+0000 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:472: TEST_auto_repair_bluestore_basic: wait_for_clean 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: 
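The wait loop traced above succeeds once `test NEW '>' OLD` becomes true. That operator compares strings lexicographically, which is safe here only because fixed-width ISO-8601 timestamps sort the same way lexically and chronologically. A minimal sketch (illustrative values taken from the trace, not a call into ceph-helpers.sh):

```shell
# Lexicographic string comparison of ISO-8601 stamps, as used by wait_for_scrub.
old='2026-03-08T22:48:41.086215+0000'
new='2026-03-08T22:49:01.641029+0000'
if test "$new" '>' "$old"; then
    result=advanced      # scrub stamp moved forward -> loop exits
else
    result=unchanged     # stamp identical -> sleep 1 and retry
fi
echo "$result"
```

Note that `>` must be quoted (or escaped) inside `test`/`[ ]`, otherwise the shell parses it as output redirection, exactly as the quoted `'>'` in the trace shows.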
local trace=true 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:49:07.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:49:07.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:49:07.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:49:07.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:49:07.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:49:07.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:49:07.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:49:07.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:49:07.291 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:49:07.291 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:49:07.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:49:07.291 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:07.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:49:07.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215109 2026-03-08T22:49:07.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215109 2026-03-08T22:49:07.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215109' 2026-03-08T22:49:07.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:07.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:49:07.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672971 2026-03-08T22:49:07.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672971 2026-03-08T22:49:07.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215109 1-42949672971' 2026-03-08T22:49:07.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:07.451 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:49:07.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542154 2026-03-08T22:49:07.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542154 2026-03-08T22:49:07.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215109 1-42949672971 2-60129542154' 2026-03-08T22:49:07.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:07.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215109 2026-03-08T22:49:07.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:49:07.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215109 2026-03-08T22:49:07.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:07.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215109 2026-03-08T22:49:07.532 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 103079215109 
2026-03-08T22:49:07.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215109' 2026-03-08T22:49:07.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:49:07.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215108 -lt 103079215109 2026-03-08T22:49:07.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:49:08.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:49:08.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:49:08.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215109 -lt 103079215109 2026-03-08T22:49:08.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:08.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672971 2026-03-08T22:49:08.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:08.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:49:08.872 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672971 2026-03-08T22:49:08.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:08.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672971 2026-03-08T22:49:08.873 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672971 2026-03-08T22:49:08.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672971' 2026-03-08T22:49:08.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:49:09.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672971 -lt 42949672971 2026-03-08T22:49:09.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:09.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542154 2026-03-08T22:49:09.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:09.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:49:09.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542154 
2026-03-08T22:49:09.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:09.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542154 2026-03-08T22:49:09.041 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542154 2026-03-08T22:49:09.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542154' 2026-03-08T22:49:09.041 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:49:09.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542154 -lt 60129542154 2026-03-08T22:49:09.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:49:09.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:49:09.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:49:09.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:49:09.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:49:09.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: 
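The `flush_pg_stats` bookkeeping traced above accumulates `osd-seq` pairs (e.g. `0-103079215109`) in a space-separated string and later splits each pair with `cut`, polling `ceph osd last-stat-seq` until the reported sequence catches up. The parsing step in isolation (a sketch with a literal pair from the trace, no cluster calls):

```shell
# Split an "osd-seq" record the way flush_pg_stats does.
s='0-103079215109'
osd=$(echo "$s" | cut -d - -f 1)   # field 1: OSD id
seq=$(echo "$s" | cut -d - -f 2)   # field 2: stat sequence number
echo "$osd $seq"
```

The trace shows why the poll is needed: the first `last-stat-seq` read returns 103079215108, one short of the flushed 103079215109, so the helper sleeps a second before the sequence catches up.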
get_num_active_clean 2026-03-08T22:49:09.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:49:09.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:49:09.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:49:09.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:49:09.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:49:09.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:49:09.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:49:09.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:49:09.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:49:09.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:49:09.801 
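The jq filter built up above selects PG states containing both "active" and "clean" but excludes any containing "stale". A pure-bash sketch of the same predicate (the real helper pipes `ceph --format json pg dump pgs` through jq; this stand-in just pattern-matches state strings):

```shell
# Count states that are active+clean and not stale, mirroring the jq filter:
#   select(contains("active") and contains("clean")) | select(contains("stale") | not)
count_active_clean() {
    local n=0 state
    for state in "$@"; do
        case $state in
            *stale*) ;;                              # stale states never count
            *active*clean*|*clean*active*) n=$((n+1)) ;;
        esac
    done
    echo "$n"
}
count=$(count_active_clean active+clean stale+active+clean active+recovering peering)
echo "$count"
```

Against the sample states only the plain `active+clean` entry qualifies, matching the `cur_active_clean=1` result in the trace.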
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:49:09.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:49:09.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:473: TEST_auto_repair_bluestore_basic: ceph pg dump pgs 2026-03-08T22:49:09.954 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:49:09.954 INFO:tasks.workunit.client.0.vm03.stdout:1.0 1 0 0 0 0 7 0 0 1 0 1 active+clean 2026-03-08T22:49:01.644214+0000 20'1 25:58 [1,0] 1 [1,0] 1 20'1 2026-03-08T22:49:01.641029+0000 20'1 2026-03-08T22:49:01.641029+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:49:01.641029+0000 1 0 2026-03-08T22:49:09.954 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:49:09.954 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:49:09.954 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:49:09.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:476: TEST_auto_repair_bluestore_basic: get_not_primary testpool SOMETHING 2026-03-08T22:49:09.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool 2026-03-08T22:49:09.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:49:09.967 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:49:09.967 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:49:09.967 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:49:09.967 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:49:09.967 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:49:10.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:49:10.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:49:10.135 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:476: TEST_auto_repair_bluestore_basic: objectstore_tool td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:49:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: 
_objectstore_tool_nowait: shift 2026-03-08T22:49:10.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:49:10.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:49:10.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:49:10.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:49:10.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:49:10.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:49:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:49:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T22:49:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:49:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:49:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 
2026-03-08T22:49:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:49:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:49:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING list-attrs 2026-03-08T22:49:10.931 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T22:49:10.931 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T22:49:11.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T22:49:11.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 
'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 
2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:11.220 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:49:11.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:49:11.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:49:11.221 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:49:11.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' 
'--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:49:11.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:49:11.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:49:11.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:49:11.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:49:11.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:49:11.237 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:11.224+0000 7f553bbac8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:11.237 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:11.224+0000 7f553bbac8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:11.239 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:11.228+0000 7f553bbac8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:11.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:11.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:11.947 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:11.936+0000 7f553bbac8c0 -1 Falling back to public interface 2026-03-08T22:49:12.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:49:12.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:12.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:12.553 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:49:12.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:12.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:12.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:12.934 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:12.924+0000 7f553bbac8c0 -1 osd.0 25 log_to_monitors true 2026-03-08T22:49:13.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:13.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:13.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:13.721 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:49:13.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:13.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:13.922 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:14.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:14.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:14.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:14.923 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:49:14.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:14.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 29 up_thru 0 down_at 26 last_clean_interval [24,25) [v2:127.0.0.1:6802/3231256782,v1:127.0.0.1:6803/3231256782] [v2:127.0.0.1:6804/3231256782,v1:127.0.0.1:6805/3231256782] exists,up a4e5fa69-ecca-4ab4-9ebf-6b5b5313f68a 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:49:15.089 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:49:15.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:49:15.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:49:15.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:49:15.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:49:15.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T22:49:15.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:49:15.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:49:15.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:49:15.150 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:49:15.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:49:15.316 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:49:15.316 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:49:15.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:49:15.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:15.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:49:15.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051586 2026-03-08T22:49:15.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051586 2026-03-08T22:49:15.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-124554051586' 2026-03-08T22:49:15.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:15.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:49:15.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672974 2026-03-08T22:49:15.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672974 2026-03-08T22:49:15.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586 1-42949672974' 2026-03-08T22:49:15.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:15.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:49:15.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542156 2026-03-08T22:49:15.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542156 2026-03-08T22:49:15.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586 1-42949672974 2-60129542156' 2026-03-08T22:49:15.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:49:15.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-124554051586 2026-03-08T22:49:15.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:15.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:49:15.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-124554051586 2026-03-08T22:49:15.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:15.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051586 2026-03-08T22:49:15.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 124554051586' 2026-03-08T22:49:15.550 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 124554051586 2026-03-08T22:49:15.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:49:15.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 124554051586 2026-03-08T22:49:15.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:49:16.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:49:16.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:49:16.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051586 -lt 124554051586 2026-03-08T22:49:16.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:16.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672974 2026-03-08T22:49:16.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:16.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:49:16.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672974 2026-03-08T22:49:16.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:16.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672974 2026-03-08T22:49:16.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672974' 2026-03-08T22:49:16.880 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672974 2026-03-08T22:49:16.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T22:49:17.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672974 -lt 42949672974 2026-03-08T22:49:17.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:17.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542156 2026-03-08T22:49:17.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:17.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:49:17.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542156 2026-03-08T22:49:17.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:17.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542156 2026-03-08T22:49:17.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542156' 2026-03-08T22:49:17.041 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542156 2026-03-08T22:49:17.041 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:49:17.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 60129542157 -lt 60129542156 2026-03-08T22:49:17.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:49:17.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:49:17.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:49:17.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:49:17.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:49:17.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:49:17.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:49:17.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:49:17.384 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:49:17.384 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:49:17.384 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:49:17.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:49:17.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:49:17.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:49:17.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:49:17.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:49:17.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:49:17.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:49:17.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:477: TEST_auto_repair_bluestore_basic: get_not_primary testpool SOMETHING 2026-03-08T22:49:17.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool 2026-03-08T22:49:17.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local 
objectname=SOMETHING 2026-03-08T22:49:17.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:49:17.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:49:17.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:49:17.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:49:17.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:49:17.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:49:17.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:49:17.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. 
!= 1)) | .[0]' 2026-03-08T22:49:18.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:477: TEST_auto_repair_bluestore_basic: objectstore_tool td/osd-scrub-repair 0 SOMETHING get-bytes td/osd-scrub-repair/COPY 2026-03-08T22:49:18.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:49:18.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:49:18.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:49:18.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:49:18.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING get-bytes td/osd-scrub-repair/COPY 2026-03-08T22:49:18.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:49:18.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:49:18.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:49:18.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:49:18.072 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:49:18.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:49:18.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:49:18.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:49:18.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:49:18.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:49:18.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:49:18.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING get-bytes td/osd-scrub-repair/COPY 2026-03-08T22:49:18.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:49:18.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:49:18.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:49:18.178 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:49:18.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:49:18.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING get-bytes td/osd-scrub-repair/COPY 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:49:18.802 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:49:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 
2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:18.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:18.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:49:18.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:49:18.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:18.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:18.804 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:18.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:18.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:49:18.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:49:18.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:49:18.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:49:18.805 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:49:18.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 
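The ceph-osd invocation assembled above has a quoting detail worth noting: arguments containing `$cluster` or `$name` are passed in single quotes so the shell leaves them literal, and ceph-osd expands those metavariables itself per daemon. A minimal sketch (not the helper itself) of how activate_osd builds that command line, using paths and the fsid from this trace:

```shell
# Sketch of activate_osd's argument assembly (ceph-helpers.sh), as traced
# above. Values with $name/$cluster are kept literal for ceph-osd to expand.
build_osd_args() {
    local dir=$1 id=$2
    local args="--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none"
    args+=" --osd-data=$dir/$id"            # expanded now, by the shell
    args+=" --osd-journal=$dir/$id/journal"
    # left literal: ceph-osd substitutes $name / $cluster at startup
    args+=' --log-file='"$dir"'/$name.log'
    args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
    echo "$args"
}

build_osd_args td/osd-scrub-repair 0
```

This is why the echoed ceph-osd command in the trace shows some arguments quoted (`'--log-file=td/osd-scrub-repair/$name.log'`) and others not.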
2026-03-08T22:49:18.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:49:18.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:49:18.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:49:18.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:49:18.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:49:18.822 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:18.808+0000 7fb56bafd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:18.826 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:18.816+0000 7fb56bafd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:18.828 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:18.816+0000 7fb56bafd8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:18.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:19.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:20.003 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:19.993+0000 7fb56bafd8c0 -1 Falling back to public interface 2026-03-08T22:49:20.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:49:20.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:20.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:20.140 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:49:20.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:20.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:20.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:20.994 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:20.985+0000 7fb56bafd8c0 -1 osd.0 30 log_to_monitors true 2026-03-08T22:49:21.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:21.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:21.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:21.301 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:49:21.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:21.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:21.463 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:22.356 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:22.345+0000 7fb562aad640 -1 osd.0 30 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:49:22.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:22.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:22.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:22.464 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:49:22.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:22.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:22.626 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 34 up_thru 0 down_at 31 last_clean_interval [29,30) [v2:127.0.0.1:6802/1547336922,v1:127.0.0.1:6803/1547336922] [v2:127.0.0.1:6804/1547336922,v1:127.0.0.1:6805/1547336922] exists,up a4e5fa69-ecca-4ab4-9ebf-6b5b5313f68a 2026-03-08T22:49:22.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:22.627 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:49:22.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:49:22.684 
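The wait_for_osd sequence traced above is a simple bounded poll: run `ceph osd dump`, grep for `osd.<id> <state>`, and retry once a second for up to 300 tries. A self-contained sketch of that loop; `ceph` and `sleep` are stubbed here so it runs without a cluster:

```shell
# Sketch of the wait_for_osd polling loop from ceph-helpers.sh.
ceph() { echo 'osd.0 up in weight 1 up_from 34'; }   # stand-in for the CLI
sleep() { :; }                                       # no real waiting

wait_for_osd() {
    local state=$1 id=$2 status=1 i
    for ((i = 0; i < 300; i++)); do
        echo $i                      # loop counter, echoed as in the trace
        if ceph osd dump | grep "osd.$id $state"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}
```

In the trace the grep matched on the fourth try (counter 3), once the new osdmap epoch marked osd.0 up again after the objectstore-tool restart.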
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:49:22.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:49:22.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:49:22.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:49:22.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:49:22.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:49:22.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:49:22.837 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:49:22.837 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:49:22.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:49:22.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:22.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:49:22.909 
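The delays array above (`0.1 0.2 0.4 ... 15 15 15 15 4.5`) comes from get_timeout_delays. A sketch of the rule it evidently follows, judging by the trace: start at the given delay, double each step up to a 15-second cap, and trim the final step so the total equals the timeout (here 90s). awk supplies the floating-point arithmetic the shell lacks; this is an illustrative reconstruction, not the helper's actual implementation:

```shell
# Sketch of the backoff-delay generation behind wait_for_clean's delays array.
get_timeout_delays_sketch() {
    local timeout=$1 delay=$2 max=15 total=0 out='' d rem
    delay=$(awk -v d="$delay" 'BEGIN { print d + 0 }')   # normalize ".1" -> "0.1"
    while awk -v t="$total" -v T="$timeout" 'BEGIN { exit !(t < T) }'; do
        rem=$(awk -v T="$timeout" -v t="$total" 'BEGIN { print T - t }')
        d=$(awk -v d="$delay" -v r="$rem" 'BEGIN { print (d < r ? d : r) }')
        out="$out$d "
        total=$(awk -v t="$total" -v d="$d" 'BEGIN { print t + d }')
        delay=$(awk -v d="$delay" -v m="$max" 'BEGIN { x = d * 2; print (x < m ? x : m) }')
    done
    echo "${out% }"
}

get_timeout_delays_sketch 90 .1
```

Running the sketch with the trace's arguments (90, .1) reproduces the thirteen delays shown, ending in the trimmed 4.5-second step.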
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888066 2026-03-08T22:49:22.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888066 2026-03-08T22:49:22.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066' 2026-03-08T22:49:22.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:22.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:49:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672976 2026-03-08T22:49:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672976 2026-03-08T22:49:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066 1-42949672976' 2026-03-08T22:49:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:49:22.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:49:23.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542159 2026-03-08T22:49:23.051 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542159 2026-03-08T22:49:23.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066 1-42949672976 2-60129542159' 2026-03-08T22:49:23.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:23.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-146028888066 2026-03-08T22:49:23.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:49:23.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-146028888066 2026-03-08T22:49:23.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:23.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888066 2026-03-08T22:49:23.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 146028888066' 2026-03-08T22:49:23.053 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 146028888066 2026-03-08T22:49:23.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T22:49:23.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 146028888066 2026-03-08T22:49:23.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:49:24.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:49:24.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:49:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 146028888066 2026-03-08T22:49:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:49:25.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:49:25.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:49:25.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888066 -lt 146028888066 2026-03-08T22:49:25.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:25.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672976 2026-03-08T22:49:25.520 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:49:25.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:49:25.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672976 2026-03-08T22:49:25.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:25.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672976 2026-03-08T22:49:25.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672976' 2026-03-08T22:49:25.523 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672976 2026-03-08T22:49:25.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:49:25.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672976 -lt 42949672976 2026-03-08T22:49:25.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:49:25.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542159 2026-03-08T22:49:25.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T22:49:25.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:49:25.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542159 2026-03-08T22:49:25.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:49:25.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542159 2026-03-08T22:49:25.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542159' 2026-03-08T22:49:25.678 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542159 2026-03-08T22:49:25.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:49:25.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542159 -lt 60129542159 2026-03-08T22:49:25.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:49:25.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:49:25.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:49:26.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 
1 == 0 2026-03-08T22:49:26.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:49:26.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:49:26.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:49:26.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:49:26.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:49:26.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:49:26.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:49:26.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:49:26.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:49:26.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:49:26.177 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:49:26.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:49:26.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:49:26.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:49:26.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:478: TEST_auto_repair_bluestore_basic: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:49:26.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:479: TEST_auto_repair_bluestore_basic: grep scrub_finish td/osd-scrub-repair/osd.1.log 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:49:01.636+0000 7f79367c6640 10 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] planned AUTO_REPAIR TIME_FOR_DEEP] scrubber: scrub_finish before flags: AUTO_REPAIR. repair state: no-repair. 
deep_scrub_on_error: 0 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:49:01.636+0000 7f79367c6640 10 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 1 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:49:01.636+0000 7f79367c6640 15 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish: 1 errors. 1 errors fixed 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:49:01.636+0000 7f79367c6640 20 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish All may be fixed 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:49:01.636+0000 7f79367c6640 19 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=15/15 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:49:26.366 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:49:26.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:49:26.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:49:26.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:49:26.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:49:26.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:49:26.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:49:26.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:49:26.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:49:26.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:49:26.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:49:26.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:49:26.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:49:26.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:49:26.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:49:26.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:49:26.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:49:26.500 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:26.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:26.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:49:26.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:49:26.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:49:26.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:49:26.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:49:26.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:49:26.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:49:26.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:49:26.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:49:26.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:49:26.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:49:26.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:49:26.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:49:26.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:49:26.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:49:26.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:49:26.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:49:26.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:49:26.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:49:26.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:49:26.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:49:26.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:49:26.510 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:26.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:26.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:49:26.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:49:26.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:49:26.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T22:49:26.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:49:26.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:26.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:26.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T22:49:26.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_auto_repair_bluestore_failed td/osd-scrub-repair 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:539: TEST_auto_repair_bluestore_failed: local dir=td/osd-scrub-repair 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:540: TEST_auto_repair_bluestore_failed: local poolname=testpool 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:543: TEST_auto_repair_bluestore_failed: run_mon td/osd-scrub-repair a 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local 
data=td/osd-scrub-repair/a 2026-03-08T22:49:26.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T22:49:26.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:49:26.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:26.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:26.538 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:26.538 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:26.538 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:26.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:26.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 
--mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:49:26.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:49:26.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:49:26.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:49:26.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:49:26.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:49:26.566 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:49:26.566 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:49:26.566 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:49:26.566 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:49:26.566 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:26.566 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:26.566 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:49:26.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:49:26.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:49:26.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T22:49:26.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:49:26.627 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:49:26.627 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T22:49:26.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:49:26.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T22:49:26.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:544: TEST_auto_repair_bluestore_failed: run_mgr td/osd-scrub-repair x
2026-03-08T22:49:26.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T22:49:26.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:49:26.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:49:26.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:49:26.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T22:49:26.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:49:26.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:49:26.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:49:26.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:49:26.780 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:49:26.781 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:49:26.781 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:49:26.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:49:26.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:49:26.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:49:26.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:547: TEST_auto_repair_bluestore_failed: local 'ceph_osd_args=--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:49:26.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:548: TEST_auto_repair_bluestore_failed: seq 0 2
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:548: TEST_auto_repair_bluestore_failed: for id in $(seq 0 2)
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:549: TEST_auto_repair_bluestore_failed: run_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:49:26.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:49:26.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T22:49:26.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:49:26.807 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 d26256ab-201b-4cf3-9723-3c20474ea0b8
2026-03-08T22:49:26.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=d26256ab-201b-4cf3-9723-3c20474ea0b8
2026-03-08T22:49:26.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 d26256ab-201b-4cf3-9723-3c20474ea0b8'
2026-03-08T22:49:26.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:49:26.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQD2/K1pj7t7MBAAeakdziMSxikImRjUtHHtqA==
2026-03-08T22:49:26.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQD2/K1pj7t7MBAAeakdziMSxikImRjUtHHtqA=="}'
2026-03-08T22:49:26.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new d26256ab-201b-4cf3-9723-3c20474ea0b8 -i td/osd-scrub-repair/0/new.json
2026-03-08T22:49:26.926 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:49:26.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T22:49:26.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQD2/K1pj7t7MBAAeakdziMSxikImRjUtHHtqA== --osd-uuid d26256ab-201b-4cf3-9723-3c20474ea0b8
2026-03-08T22:49:26.955 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:26.946+0000 7f6f48df98c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:26.956 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:26.950+0000 7f6f48df98c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:26.958 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:26.950+0000 7f6f48df98c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:26.958 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:26.950+0000 7f6f48df98c0 -1 bdev(0x562c62116c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:49:26.958 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:26.950+0000 7f6f48df98c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T22:49:29.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T22:49:29.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:49:29.209 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T22:49:29.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:49:29.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:49:29.405 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T22:49:29.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:49:29.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:49:29.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:49:29.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:49:29.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:49:29.419 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:29.410+0000 7f61e99628c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:29.420 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:29.414+0000 7f61e99628c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:29.421 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:29.414+0000 7f61e99628c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:49:29.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:49:29.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:49:30.389 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:30.383+0000 7f61e99628c0 -1 Falling back to public interface
2026-03-08T22:49:30.716 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:49:30.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:49:30.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:49:30.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:49:30.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:49:30.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:49:30.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:49:31.358 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:31.351+0000 7f61e99628c0 -1 osd.0 0 log_to_monitors true
2026-03-08T22:49:31.865 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:49:31.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:49:31.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:49:31.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:49:31.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:49:31.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:49:32.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:49:33.039 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:49:33.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:49:33.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:49:33.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:49:33.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:49:33.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:49:33.199 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3149764354,v1:127.0.0.1:6803/3149764354] [v2:127.0.0.1:6804/3149764354,v1:127.0.0.1:6805/3149764354] exists,up d26256ab-201b-4cf3-9723-3c20474ea0b8
2026-03-08T22:49:33.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:49:33.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:49:33.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:49:33.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:548: TEST_auto_repair_bluestore_failed: for id in $(seq 0 2)
2026-03-08T22:49:33.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:549: TEST_auto_repair_bluestore_failed: run_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:49:33.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:49:33.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:49:33.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:49:33.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T22:49:33.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:49:33.203 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 ac846ce5-c18d-4200-a9cb-f5bc986abf24
2026-03-08T22:49:33.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ac846ce5-c18d-4200-a9cb-f5bc986abf24
2026-03-08T22:49:33.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 ac846ce5-c18d-4200-a9cb-f5bc986abf24'
2026-03-08T22:49:33.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:49:33.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQD9/K1pycaUDBAAQCH+J1JYzJsphc03KKoQ8A==
2026-03-08T22:49:33.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQD9/K1pycaUDBAAQCH+J1JYzJsphc03KKoQ8A=="}'
2026-03-08T22:49:33.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ac846ce5-c18d-4200-a9cb-f5bc986abf24 -i td/osd-scrub-repair/1/new.json
2026-03-08T22:49:33.369 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:49:33.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json
2026-03-08T22:49:33.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQD9/K1pycaUDBAAQCH+J1JYzJsphc03KKoQ8A== --osd-uuid ac846ce5-c18d-4200-a9cb-f5bc986abf24
2026-03-08T22:49:33.399 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:33.391+0000 7f985fa8d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:33.400 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:33.395+0000 7f985fa8d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:33.401 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:33.395+0000 7f985fa8d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:33.402 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:33.395+0000 7f985fa8d8c0 -1 bdev(0x55ab33d47c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:49:33.402 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:33.395+0000 7f985fa8d8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid
2026-03-08T22:49:35.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring
2026-03-08T22:49:35.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:49:35.652 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T22:49:35.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:49:35.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:49:35.839 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T22:49:35.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:49:35.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:49:35.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:49:35.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:49:35.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:49:35.858 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:35.852+0000 7f83cff1a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:35.858 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:35.852+0000 7f83cff1a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:35.859 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:35.856+0000 7f83cff1a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:49:36.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:37.044 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:37.040+0000 7f83cff1a8c0 -1 Falling back to public interface 2026-03-08T22:49:37.162 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:49:37.162 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:37.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:37.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:37.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:37.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:49:37.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:38.013 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:38.008+0000 7f83cff1a8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:49:38.317 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:49:38.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:38.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:38.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:38.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:49:38.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:38.498 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:39.501 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:49:39.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:39.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:39.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:39.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:39.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1977983778,v1:127.0.0.1:6811/1977983778] [v2:127.0.0.1:6812/1977983778,v1:127.0.0.1:6813/1977983778] exists,up ac846ce5-c18d-4200-a9cb-f5bc986abf24 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:548: TEST_auto_repair_bluestore_failed: for id in 
$(seq 0 2) 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:549: TEST_auto_repair_bluestore_failed: run_osd td/osd-scrub-repair 2 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:39.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 
2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:39.660 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:39.660 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:49:39.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:49:39.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:49:39.662 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 c9125b17-35cc-4ec0-a41b-80761b998045 2026-03-08T22:49:39.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=c9125b17-35cc-4ec0-a41b-80761b998045 2026-03-08T22:49:39.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 c9125b17-35cc-4ec0-a41b-80761b998045' 2026-03-08T22:49:39.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:49:39.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAD/a1p02/zJxAA7QT+NDC/K97RxyaR96NB1w== 2026-03-08T22:49:39.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": 
"AQAD/a1p02/zJxAA7QT+NDC/K97RxyaR96NB1w=="}' 2026-03-08T22:49:39.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new c9125b17-35cc-4ec0-a41b-80761b998045 -i td/osd-scrub-repair/2/new.json 2026-03-08T22:49:39.827 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:49:39.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T22:49:39.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQAD/a1p02/zJxAA7QT+NDC/K97RxyaR96NB1w== --osd-uuid c9125b17-35cc-4ec0-a41b-80761b998045 2026-03-08T22:49:39.856 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:39.848+0000 7f13a38908c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:39.858 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:39.848+0000 7f13a38908c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:39.858 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:39.852+0000 7f13a38908c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:39.859 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:39.852+0000 7f13a38908c0 -1 bdev(0x5606a5859c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:49:39.859 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:39.852+0000 7f13a38908c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T22:49:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T22:49:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:49:43.095 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T22:49:43.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:49:43.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:49:43.296 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T22:49:43.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:49:43.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:49:43.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:49:43.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:49:43.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:49:43.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:43.305+0000 7fb3ab9448c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:43.318 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:43.313+0000 7fb3ab9448c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:43.320 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:43.313+0000 7fb3ab9448c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:49:43.464 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:43.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:49:43.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:44.271 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:44.265+0000 7fb3ab9448c0 -1 Falling back to public interface 2026-03-08T22:49:44.622 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:49:44.622 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:44.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:44.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:49:44.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:44.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:49:44.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:45.232 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:45.225+0000 7fb3ab9448c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:49:45.779 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:49:45.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:45.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:45.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:49:45.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:49:45.950 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:46.238 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:46.233+0000 7fb3a70fd640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T22:49:46.951 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:49:46.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:49:46.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:46.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:49:46.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:46.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:49:47.105 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/964257944,v1:127.0.0.1:6819/964257944] [v2:127.0.0.1:6820/964257944,v1:127.0.0.1:6821/964257944] exists,up c9125b17-35cc-4ec0-a41b-80761b998045 2026-03-08T22:49:47.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:49:47.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:49:47.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:49:47.105 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:552: TEST_auto_repair_bluestore_failed: create_pool testpool 1 1 2026-03-08T22:49:47.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create testpool 1 1 2026-03-08T22:49:47.272 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool' created 2026-03-08T22:49:47.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:49:48.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:553: TEST_auto_repair_bluestore_failed: ceph osd pool set testpool size 2 2026-03-08T22:49:48.485 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 size to 2 2026-03-08T22:49:48.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:554: TEST_auto_repair_bluestore_failed: wait_for_clean 2026-03-08T22:49:48.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:49:48.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:49:48.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:49:48.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:49:48.501 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 
2026-03-08T22:49:48.501 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:49:48.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:49:48.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:49:48.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:49:48.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:49:48.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:49:48.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:49:48.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:49:48.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:49:48.558 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:49:48.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:49:48.720 
INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:49:48.720 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:49:48.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:49:48.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:49:48.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:49:48.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485
2026-03-08T22:49:48.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485
2026-03-08T22:49:48.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485'
2026-03-08T22:49:48.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:49:48.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:49:48.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672963
2026-03-08T22:49:48.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672963
2026-03-08T22:49:48.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963'
2026-03-08T22:49:48.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:49:48.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:49:48.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509442
2026-03-08T22:49:48.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509442
2026-03-08T22:49:48.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963 2-64424509442'
2026-03-08T22:49:48.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:49:48.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485
2026-03-08T22:49:48.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:49:48.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:49:48.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485
2026-03-08T22:49:48.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:49:48.939 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485
2026-03-08T22:49:48.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485
2026-03-08T22:49:48.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485'
2026-03-08T22:49:48.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:49:49.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836485
2026-03-08T22:49:49.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:49:50.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:49:50.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:49:50.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485
2026-03-08T22:49:50.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:49:50.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672963
2026-03-08T22:49:50.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:49:50.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:49:50.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672963
2026-03-08T22:49:50.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:49:50.257 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963
2026-03-08T22:49:50.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672963
2026-03-08T22:49:50.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672963'
2026-03-08T22:49:50.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:49:50.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672963 -lt 42949672963
2026-03-08T22:49:50.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:49:50.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509442
2026-03-08T22:49:50.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:49:50.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:49:50.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509442
2026-03-08T22:49:50.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:49:50.424 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442
2026-03-08T22:49:50.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509442
2026-03-08T22:49:50.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509442'
2026-03-08T22:49:50.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:49:50.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509442
2026-03-08T22:49:50.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:49:50.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:49:50.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:49:50.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T22:49:50.765
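The flush sequence traced above follows a simple pattern: ask each OSD to flush its PG stats, note the sequence number the `tell` returns, then poll `ceph osd last-stat-seq <id>` until it reaches that value. A runnable sketch of that pattern, with a stub `ceph()` function standing in for the real CLI so it works without a cluster (the stub's return value 7 is arbitrary; real sequence numbers are 64-bit values like 21474836485):

```shell
# Stub ceph CLI: flush returns a sequence number; last-stat-seq reports
# the stats already flushed through that sequence.
ceph() {
    case "$1 $2" in
        "tell osd.0")        echo 7 ;;   # stub: flush returns seq 7
        "osd last-stat-seq") echo 7 ;;   # stub: already caught up
    esac
}

# Same flush-then-poll pattern as ceph-helpers.sh's flush_pg_stats.
seq_flushed=$(ceph tell osd.0 flush_pg_stats)
echo "waiting osd.0 seq $seq_flushed"
while test "$(ceph osd last-stat-seq 0)" -lt "$seq_flushed"; do
    sleep 1    # the real helper also counts down a timeout here
done
```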
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:49:50.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:49:50.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:49:50.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:49:50.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:49:50.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:49:50.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:49:50.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0
2026-03-08T22:49:50.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:49:50.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:49:50.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:49:51.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1
2026-03-08T22:49:51.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' -1
2026-03-08T22:49:51.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0
2026-03-08T22:49:51.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=0
2026-03-08T22:49:51.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval
2026-03-08T22:49:51.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1
2026-03-08T22:49:51.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:49:51.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:49:51.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:49:51.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:49:51.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:49:51.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:49:51.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:49:51.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:49:51.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0
2026-03-08T22:49:51.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:49:51.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:49:51.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + '
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + '
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status
2026-03-08T22:49:51.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
2026-03-08T22:49:51.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null
2026-03-08T22:49:51.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null
2026-03-08T22:49:51.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 1 >= 13 ))
2026-03-08T22:49:51.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval
2026-03-08T22:49:51.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.2
2026-03-08T22:49:51.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:49:51.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:49:51.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:49:51.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:49:51.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:49:51.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:49:51.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:49:51.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:49:52.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0
2026-03-08T22:49:52.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:49:52.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:49:52.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:49:52.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1
2026-03-08T22:49:52.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0
2026-03-08T22:49:52.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress
2026-03-08T22:49:52.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress
2026-03-08T22:49:52.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + '
2026-03-08T22:49:52.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + '
2026-03-08T22:49:52.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec
2026-03-08T22:49:52.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status
2026-03-08T22:49:52.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
2026-03-08T22:49:52.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null
2026-03-08T22:49:52.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null
2026-03-08T22:49:52.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 2 >= 13 ))
2026-03-08T22:49:52.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval
2026-03-08T22:49:52.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.4
2026-03-08T22:49:52.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:49:52.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:49:52.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:49:52.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:49:52.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:49:52.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:49:52.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:49:52.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:49:53.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0
2026-03-08T22:49:53.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:49:53.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:49:53.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + '
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + '
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status
2026-03-08T22:49:53.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
2026-03-08T22:49:53.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null
2026-03-08T22:49:53.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null
2026-03-08T22:49:53.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 3 >= 13 ))
2026-03-08T22:49:53.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval
2026-03-08T22:49:53.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.8
2026-03-08T22:49:54.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:49:54.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:49:54.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:49:54.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:49:54.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:49:54.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:49:54.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:49:54.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:49:54.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0
2026-03-08T22:49:54.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:49:54.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:49:54.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:49:54.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1
2026-03-08T22:49:54.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0
2026-03-08T22:49:54.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress
2026-03-08T22:49:54.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress
2026-03-08T22:49:54.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + '
2026-03-08T22:49:54.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + '
2026-03-08T22:49:54.532
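The `get_is_making_recovery_progress` check traced above sums three pgmap rate fields with jq. When the cluster is idle those fields are absent from `ceph status` output, and in jq adding absent (null) values yields null, which is why the trace shows `progress=null` and the subsequent `test null '!=' null` failing, i.e. no recovery activity. A self-contained repro with a canned pgmap document (the `num_pgs` field is just filler):

```shell
# Same jq filter as the helper, run over a pgmap with no recovery fields:
# null + null + null is null in jq, so an idle cluster reports "null".
progress=$(echo '{"pgmap": {"num_pgs": 1}}' | jq -r '.pgmap
    | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec')
echo "$progress"
```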
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec
2026-03-08T22:49:54.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status
2026-03-08T22:49:54.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec'
2026-03-08T22:49:54.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null
2026-03-08T22:49:54.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null
2026-03-08T22:49:54.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 4 >= 13 ))
2026-03-08T22:49:54.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval
2026-03-08T22:49:54.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 1.6
2026-03-08T22:49:56.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1
2026-03-08T22:49:56.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:49:56.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:49:56.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:49:56.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:49:56.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:49:56.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:49:56.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:49:56.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T22:49:56.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:49:56.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:49:56.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:49:56.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T22:49:56.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:49:56.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:49:56.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:557: TEST_auto_repair_bluestore_failed: local payload=ABCDEF
2026-03-08T22:49:56.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:558: TEST_auto_repair_bluestore_failed: echo ABCDEF
2026-03-08T22:49:56.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: seq 1 10
2026-03-08T22:49:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10)
2026-03-08T22:49:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj1 td/osd-scrub-repair/ORIGINAL
2026-03-08T22:49:56.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10)
2026-03-08T22:49:56.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj2 td/osd-scrub-repair/ORIGINAL
2026-03-08T22:49:56.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq
1 10) 2026-03-08T22:49:56.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj3 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:49:56.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10) 2026-03-08T22:49:56.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj4 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:49:56.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10) 2026-03-08T22:49:56.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj5 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:49:56.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10) 2026-03-08T22:49:56.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj6 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:49:56.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10) 2026-03-08T22:49:56.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj7 
td/osd-scrub-repair/ORIGINAL 2026-03-08T22:49:56.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10) 2026-03-08T22:49:56.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj8 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:49:56.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10) 2026-03-08T22:49:56.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj9 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:49:56.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:559: TEST_auto_repair_bluestore_failed: for i in $(seq 1 10) 2026-03-08T22:49:56.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:561: TEST_auto_repair_bluestore_failed: rados --pool testpool put obj10 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:49:56.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:566: TEST_auto_repair_bluestore_failed: get_not_primary testpool SOMETHING 2026-03-08T22:49:56.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool 2026-03-08T22:49:56.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:49:56.887 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:49:56.887 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:49:56.887 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:49:56.887 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:49:56.887 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:49:57.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:49:57.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:49:57.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. 
!= 1)) | .[0]' 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:566: TEST_auto_repair_bluestore_failed: objectstore_tool td/osd-scrub-repair 0 obj1 remove 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 obj1 remove 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: 
_objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:49:57.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:49:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:49:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 obj1 remove 2026-03-08T22:49:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:49:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:49:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:49:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 
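[editor's note] Earlier in the trace, get_not_primary filters the acting set with `jq '.acting | map(select(. != 1)) | .[0]'` (1 being the acting primary it just looked up). The equivalent selection, sketched in Python (function name illustrative):

```python
def first_non_primary(acting, primary):
    """Return the first OSD id in the acting set that is not the
    acting primary, like the jq map/select/.[0] pipeline; None if
    the acting set holds only the primary."""
    return next((osd for osd in acting if osd != primary), None)
```

For an acting set of [1, 0, 2] with primary 1, this picks OSD 0 — the OSD the test then removes obj1 from with ceph-objectstore-tool.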
2026-03-08T22:49:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:49:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj1 remove 2026-03-08T22:49:58.165 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:ff7b1f36:::obj1:head# 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:49:58.696 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:49:58.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 
2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:49:58.697 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:49:58.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:49:58.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:49:58.698 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:49:58.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:49:58.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:49:58.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:49:58.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:49:58.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:49:58.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:49:58.713 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:58.710+0000 7f4acea2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:58.721 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:58.718+0000 7f4acea2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:49:58.723 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:58.718+0000 7f4acea2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
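[editor's note] Note that activate_osd single-quotes arguments such as `--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok`, so the shell passes `$cluster` and `$name` through literally; Ceph expands those metavariables itself when the daemon parses its options, giving each daemon its own socket and log file from one template. A rough stand-in for that substitution (simplified; the real expansion happens inside Ceph's config handling):

```python
def expand_metavars(value, cluster="ceph", name="osd.0"):
    # simplified stand-in for Ceph's $cluster/$name metavariable
    # expansion of config option values
    return value.replace("$cluster", cluster).replace("$name", name)
```

For osd.0 in the default cluster, the template above resolves to `/tmp/ceph-asok.43024/ceph-osd.0.asok`.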
2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:49:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:49:59.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:49:59.921 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:49:59.918+0000 7f4acea2b8c0 -1 Falling back to public interface 2026-03-08T22:50:00.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:50:00.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:00.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:00.028 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:00.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:00.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:00.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:00.895 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:00.890+0000 7f4acea2b8c0 -1 osd.0 20 log_to_monitors true 2026-03-08T22:50:01.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:01.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:01.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:50:01.186 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:50:01.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:01.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:01.353 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:02.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:02.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:02.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:50:02.354 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:50:02.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:02.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 24 up_thru 0 down_at 21 last_clean_interval [5,20) [v2:127.0.0.1:6802/3602828897,v1:127.0.0.1:6803/3602828897] [v2:127.0.0.1:6804/3602828897,v1:127.0.0.1:6805/3602828897] exists,up d26256ab-201b-4cf3-9723-3c20474ea0b8 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:50:02.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:50:02.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:50:02.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
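[editor's note] The delays array get_timeout_delays returns here ('0.1' '0.2' '0.4' ... '15' '15' '4.5') is an exponential backoff schedule: the delay doubles from the initial 0.1 s up to a cap, repeats at the cap, and ends with a remainder so the sleeps sum to the 90 s timeout. A reimplementation sketch that reproduces the traced schedule (parameter names assumed, not the helper's actual interface):

```python
def timeout_delays(timeout, first, cap=15):
    """Build a backoff schedule: double `first` each step up to
    `cap`, then repeat `cap`, and append the remainder so the
    delays sum to exactly `timeout`."""
    delays, total, d = [], 0.0, first
    while total + d <= timeout:
        delays.append(d)
        total += d
        d = min(d * 2, cap)
    if timeout - total > 1e-9:
        delays.append(timeout - total)  # e.g. the trailing 4.5 here
    return delays
```

`timeout_delays(90, .1)` yields the thirteen delays shown in the trace, ending with the 4.5 s remainder.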
2026-03-08T22:50:02.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:50:02.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:50:02.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:50:02.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:50:02.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:50:02.719 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:02.719 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:50:02.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:50:02.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:02.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:50:02.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215106 2026-03-08T22:50:02.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215106 2026-03-08T22:50:02.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-103079215106' 2026-03-08T22:50:02.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:02.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:50:02.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672967 2026-03-08T22:50:02.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672967 2026-03-08T22:50:02.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672967' 2026-03-08T22:50:02.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:02.864 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:50:02.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509446 2026-03-08T22:50:02.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509446 2026-03-08T22:50:02.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672967 2-64424509446' 2026-03-08T22:50:02.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:50:02.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215106
2026-03-08T22:50:02.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:02.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:50:02.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215106
2026-03-08T22:50:02.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:02.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215106
2026-03-08T22:50:02.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215106'
2026-03-08T22:50:02.936 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 103079215106
2026-03-08T22:50:02.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:50:03.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 103079215106
2026-03-08T22:50:03.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:50:04.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:50:04.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:50:04.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215106 -lt 103079215106
2026-03-08T22:50:04.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:50:04.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672967
2026-03-08T22:50:04.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:04.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:50:04.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672967
2026-03-08T22:50:04.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:04.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672967
2026-03-08T22:50:04.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672967'
2026-03-08T22:50:04.253 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672967
2026-03-08T22:50:04.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:50:04.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672967 -lt 42949672967
2026-03-08T22:50:04.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:50:04.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509446
2026-03-08T22:50:04.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:04.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:50:04.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509446
2026-03-08T22:50:04.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:04.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509446
2026-03-08T22:50:04.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509446'
2026-03-08T22:50:04.418 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509446
2026-03-08T22:50:04.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:50:04.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509446 -lt 64424509446
2026-03-08T22:50:04.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:50:04.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:50:04.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:50:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:50:04.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T22:50:04.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:50:04.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:50:04.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:50:05.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:568: TEST_auto_repair_bluestore_failed: get_not_primary testpool SOMETHING
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING
2026-03-08T22:50:05.106 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:50:05.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1
2026-03-08T22:50:05.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING
2026-03-08T22:50:05.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]'
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:568: TEST_auto_repair_bluestore_failed: objectstore_tool td/osd-scrub-repair 0 obj2 remove
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 obj2 remove
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:50:05.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:50:05.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:50:05.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:50:05.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:50:05.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:50:05.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 obj2 remove
2026-03-08T22:50:05.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T22:50:05.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:50:05.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T22:50:05.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:50:05.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:50:05.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj2 remove
2026-03-08T22:50:06.179 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:104778fc:::obj2:head#
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:50:06.711 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:50:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T22:50:06.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T22:50:06.713 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T22:50:06.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:50:06.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T22:50:06.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T22:50:06.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:50:06.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:50:06.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:50:06.728 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:06.723+0000 7fe15556f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:06.737 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:06.735+0000 7fe15556f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:06.739 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:06.735+0000 7fe15556f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:06.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:50:07.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:07.937 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:07.935+0000 7fe15556f8c0 -1 Falling back to public interface
2026-03-08T22:50:08.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:08.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:08.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:50:08.050 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:50:08.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:08.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:50:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:08.904 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:08.903+0000 7fe15556f8c0 -1 osd.0 25 log_to_monitors true
2026-03-08T22:50:09.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:09.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:09.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:50:09.206 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:50:09.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:09.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:50:09.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:50:10.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:50:10.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:50:10.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:50:10.383 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T22:50:10.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:50:10.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 29 up_thru 0 down_at 26 last_clean_interval [24,25) [v2:127.0.0.1:6802/1062912791,v1:127.0.0.1:6803/1062912791] [v2:127.0.0.1:6804/1062912791,v1:127.0.0.1:6805/1062912791] exists,up d26256ab-201b-4cf3-9723-3c20474ea0b8
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:50:10.536 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:50:10.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:50:10.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:50:10.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:50:10.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:50:10.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:50:10.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:50:10.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:50:10.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:50:10.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:50:10.752 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:10.752 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:50:10.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:50:10.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:10.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:50:10.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051586 2026-03-08T22:50:10.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051586 2026-03-08T22:50:10.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-124554051586' 2026-03-08T22:50:10.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:10.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:50:10.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970 2026-03-08T22:50:10.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970 2026-03-08T22:50:10.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586 1-42949672970' 2026-03-08T22:50:10.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:10.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:50:10.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509448 2026-03-08T22:50:10.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509448 2026-03-08T22:50:10.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586 1-42949672970 2-64424509448' 2026-03-08T22:50:10.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
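The `flush_pg_stats` trace above shows the seq bookkeeping: for each OSD id, `ceph tell osd.$osd flush_pg_stats` returns a sequence number, and the helper accumulates one `osd-seq` token per OSD (skipping empty replies, per the `test -z` guard), to be split apart later with `cut -d - -f 1` and `-f 2`. A sketch of the collection side, with `tell_flush` standing in for the `ceph tell` call:

```shell
# Sketch of the seq collection traced above: one "osd-seq" token per
# OSD, mirroring the `test -z` guard from the trace. tell_flush is a
# placeholder for `ceph tell osd.$osd flush_pg_stats`.
collect_seqs() {
    local seqs="" osd seq
    for osd in "$@"; do
        seq=$(tell_flush "$osd")
        test -z "$seq" && continue   # no reply from the daemon: skip it
        seqs="$seqs $osd-$seq"
    done
    echo $seqs   # unquoted on purpose: collapses the leading blank
}
```

Encoding the pairs as `osd-seq` strings keeps the helper portable to shells without associative arrays, at the cost of the `cut` round-trips visible in the trace.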
2026-03-08T22:50:10.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-124554051586 2026-03-08T22:50:10.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:10.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:50:10.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-124554051586 2026-03-08T22:50:10.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:10.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051586 2026-03-08T22:50:10.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 124554051586' 2026-03-08T22:50:10.968 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 124554051586 2026-03-08T22:50:10.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:11.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 124554051586 2026-03-08T22:50:11.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:50:12.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
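The second half of `flush_pg_stats`, traced above, waits for each recorded seq: it polls `ceph osd last-stat-seq $osd` until the monitor's value catches up with the flush seq, sleeping a second per miss and bailing out when the countdown hits zero. A sketch of that wait loop, with `last_stat_seq` as a stand-in for the `ceph osd last-stat-seq` call:

```shell
# Sketch of the seq-wait loop traced above: poll until the monitor's
# last-stat-seq for this OSD reaches the flush seq, with a one-second
# countdown timeout. last_stat_seq is a placeholder for
# `ceph osd last-stat-seq $osd`.
wait_for_seq() {
    local osd=$1 seq=$2 timeout=${3:-300}
    while test "$(last_stat_seq "$osd")" -lt "$seq"; do
        sleep 1
        timeout=$((timeout - 1))
        test "$timeout" -eq 0 && return 1   # gave up waiting
    done
    return 0
}
```

In the trace the first poll for osd.0 returns `1` (hence `test 1 -lt 124554051586` and one sleep), and the second poll returns the flushed seq, ending the wait.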
2026-03-08T22:50:12.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:12.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051586 -lt 124554051586 2026-03-08T22:50:12.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:12.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672970 2026-03-08T22:50:12.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:12.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:50:12.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970 2026-03-08T22:50:12.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:12.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970 2026-03-08T22:50:12.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970' 2026-03-08T22:50:12.281 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672970 2026-03-08T22:50:12.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T22:50:12.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672970 2026-03-08T22:50:12.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:12.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509448 2026-03-08T22:50:12.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:12.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:50:12.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509448 2026-03-08T22:50:12.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:12.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509448 2026-03-08T22:50:12.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509448' 2026-03-08T22:50:12.436 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509448 2026-03-08T22:50:12.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:50:12.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 64424509448 -lt 64424509448 2026-03-08T22:50:12.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:50:12.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:12.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:12.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:50:12.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:50:12.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:50:12.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:50:12.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:50:12.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:50:12.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:50:12.774 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:50:12.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:50:12.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:50:12.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:12.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:13.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:50:13.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:50:13.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:50:13.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:569: TEST_auto_repair_bluestore_failed: get_primary testpool SOMETHING 2026-03-08T22:50:13.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:50:13.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local 
objectname=SOMETHING 2026-03-08T22:50:13.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:50:13.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:50:13.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:569: TEST_auto_repair_bluestore_failed: objectstore_tool td/osd-scrub-repair 1 obj2 rm-attr _ 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 obj2 rm-attr _ 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:50:13.271 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:50:13.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:50:13.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:50:13.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 obj2 rm-attr _ 2026-03-08T22:50:13.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:50:13.579 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:50:13.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:50:13.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:50:13.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:50:13.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj2 rm-attr _ 2026-03-08T22:50:15.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:50:15.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:50:15.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:50:15.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:50:15.023 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:50:15.023 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:50:15.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:50:15.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:50:15.024 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T22:50:15.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 
--osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:50:15.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:50:15.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:50:15.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:50:15.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:50:15.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:50:15.039 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:15.035+0000 7f4f198c38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:15.047 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:15.047+0000 7f4f198c38c0 -1 WARNING: all dangerous and experimental features are enabled. 
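The `activate_osd` trace above builds the daemon command line by appending flags one at a time into `ceph_args`, and deliberately single-quotes the flags containing `$name` and `$cluster` (log file, pid file, admin socket): those are Ceph metavariables that the daemon substitutes itself, so they must reach `ceph-osd` unexpanded by the shell. A reduced sketch of that assembly step, omitting the launch itself so the result is just a string that can be inspected:

```shell
# Sketch of the ceph_args assembly traced above (only a few of the
# real helper's flags). The single-quoted segments keep $name literal:
# it is a Ceph metavariable expanded by the daemon, not by the shell.
build_osd_args() {
    local dir=$1 id=$2
    local args="--osd-data=$dir/$id"
    args="$args --osd-journal=$dir/$id/journal"
    args="$args --log-file=$dir"'/$name.log'   # daemon fills in $name
    args="$args --pid-file=$dir"'/$name.pid'
    echo "$args"
}
```

This is why the logged `ceph-osd` invocation shows arguments like `'--log-file=td/osd-scrub-repair/$name.log'` still quoted: the literal `$name` survives into the daemon's config, which resolves it to `osd.1`.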
2026-03-08T22:50:15.049 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:15.047+0000 7f4f198c38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:15.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:15.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:15.988 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:15.987+0000 7f4f198c38c0 -1 Falling back to 
public interface 2026-03-08T22:50:16.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:16.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:16.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:16.348 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:16.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:16.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:16.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:17.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:17.207+0000 7f4f198c38c0 -1 osd.1 30 log_to_monitors true 2026-03-08T22:50:17.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:17.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:17.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:50:17.509 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:50:17.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:17.509 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:17.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:18.106 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:18.103+0000 7f4f10873640 -1 osd.1 30 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:50:18.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:18.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:18.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:50:18.676 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:50:18.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:18.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:18.827 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 34 up_thru 34 down_at 31 last_clean_interval [10,30) [v2:127.0.0.1:6810/2706245168,v1:127.0.0.1:6811/2706245168] [v2:127.0.0.1:6812/2706245168,v1:127.0.0.1:6813/2706245168] exists,up ac846ce5-c18d-4200-a9cb-f5bc986abf24 2026-03-08T22:50:18.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:50:18.827 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:18.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:18.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:50:18.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:50:18.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:50:18.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:50:18.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:50:18.828 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:50:18.828 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:50:18.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:50:18.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:50:18.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:50:18.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:50:18.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:50:18.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:50:18.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:50:18.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:50:18.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:50:19.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:50:19.037 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:19.037 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:50:19.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:50:19.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:19.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:50:19.111 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051589 2026-03-08T22:50:19.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051589 2026-03-08T22:50:19.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051589' 2026-03-08T22:50:19.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:19.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:50:19.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888066 2026-03-08T22:50:19.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888066 2026-03-08T22:50:19.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051589 1-146028888066' 2026-03-08T22:50:19.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:19.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:50:19.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509451 2026-03-08T22:50:19.264 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509451 2026-03-08T22:50:19.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051589 1-146028888066 2-64424509451' 2026-03-08T22:50:19.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:19.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-124554051589 2026-03-08T22:50:19.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:19.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:50:19.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-124554051589 2026-03-08T22:50:19.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:19.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051589 2026-03-08T22:50:19.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 124554051589' 2026-03-08T22:50:19.267 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 124554051589 2026-03-08T22:50:19.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T22:50:19.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051589 -lt 124554051589 2026-03-08T22:50:19.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:19.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-146028888066 2026-03-08T22:50:19.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:19.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:50:19.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-146028888066 2026-03-08T22:50:19.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:19.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888066 2026-03-08T22:50:19.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 146028888066' 2026-03-08T22:50:19.431 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 146028888066 2026-03-08T22:50:19.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:50:19.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 1 -lt 146028888066 2026-03-08T22:50:19.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:50:20.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:50:20.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:50:20.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 146028888066 2026-03-08T22:50:20.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:50:21.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:50:21.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:50:21.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888066 -lt 146028888066 2026-03-08T22:50:21.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:21.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509451 2026-03-08T22:50:21.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:21.906 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:50:21.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509451 2026-03-08T22:50:21.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:21.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509451 2026-03-08T22:50:21.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509451' 2026-03-08T22:50:21.907 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509451 2026-03-08T22:50:21.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:50:22.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509451 -lt 64424509451 2026-03-08T22:50:22.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:50:22.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:22.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:22.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 
2026-03-08T22:50:22.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:50:22.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:50:22.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:50:22.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:50:22.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:50:22.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:50:22.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:50:22.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:50:22.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:50:22.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:22.405 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:22.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:50:22.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:50:22.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:50:22.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:571: TEST_auto_repair_bluestore_failed: get_pg testpool obj1 2026-03-08T22:50:22.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=testpool 2026-03-08T22:50:22.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=obj1 2026-03-08T22:50:22.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map testpool obj1 2026-03-08T22:50:22.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:50:22.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:571: TEST_auto_repair_bluestore_failed: local pgid=1.0 2026-03-08T22:50:22.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:572: TEST_auto_repair_bluestore_failed: get_primary testpool obj1 2026-03-08T22:50:22.741 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:50:22.742 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T22:50:22.742 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool obj1 2026-03-08T22:50:22.742 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:50:22.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:572: TEST_auto_repair_bluestore_failed: local primary=1 2026-03-08T22:50:22.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:573: TEST_auto_repair_bluestore_failed: get_last_scrub_stamp 1.0 2026-03-08T22:50:22.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:22.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:22.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:22.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:23.053 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:573: TEST_auto_repair_bluestore_failed: local last_scrub_stamp=2026-03-08T22:49:47.269575+0000 2026-03-08T22:50:23.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:574: TEST_auto_repair_bluestore_failed: ceph tell 1.0 schedule-deep-scrub 2026-03-08T22:50:23.116 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:50:23.116 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T22:50:23.116 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T22:50:23.116 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-22T22:48:43.117527+0000" 2026-03-08T22:50:23.116 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:577: TEST_auto_repair_bluestore_failed: wait_for_scrub 1.0 2026-03-08T22:49:47.269575+0000 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:49:47.269575+0000 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:23.125 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:23.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:23.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:49:47.269575+0000 '>' 2026-03-08T22:49:47.269575+0000 2026-03-08T22:50:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:24.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:24.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:24.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:24.282 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:24.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:24.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:24.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:24.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:49:47.269575+0000 '>' 2026-03-08T22:49:47.269575+0000 2026-03-08T22:50:24.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:25.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:25.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:25.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:25.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:25.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:25.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:25.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:25.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:49:47.269575+0000 '>' 2026-03-08T22:49:47.269575+0000 2026-03-08T22:50:25.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:26.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:26.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:26.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:26.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:26.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:26.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:26.583 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:26.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:49:47.269575+0000 '>' 2026-03-08T22:49:47.269575+0000 2026-03-08T22:50:26.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:27.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:27.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:27.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:27.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:27.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:27.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:27.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:27.883 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:49:47.269575+0000 '>' 2026-03-08T22:49:47.269575+0000
2026-03-08T22:50:27.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:50:28.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:50:28.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:50:28.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:50:28.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:50:28.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:50:28.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:50:28.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:50:29.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:49:47.269575+0000 '>' 2026-03-08T22:49:47.269575+0000
2026-03-08T22:50:29.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:50:30.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:50:30.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:50:30.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:50:30.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:50:30.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:50:30.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:50:30.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:50:24.129820+0000 '>' 2026-03-08T22:49:47.269575+0000
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:578: TEST_auto_repair_bluestore_failed: wait_for_clean
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:50:30.205 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:50:30.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:50:30.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:50:30.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:50:30.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:50:30.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:50:30.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:50:30.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:50:30.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:50:30.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:50:30.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:50:30.421 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:50:30.421 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:50:30.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:50:30.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:50:30.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:50:30.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051592
2026-03-08T22:50:30.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051592
2026-03-08T22:50:30.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051592'
2026-03-08T22:50:30.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:50:30.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:50:30.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888069
2026-03-08T22:50:30.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888069
2026-03-08T22:50:30.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051592 1-146028888069'
2026-03-08T22:50:30.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:50:30.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:50:30.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509454
2026-03-08T22:50:30.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509454
2026-03-08T22:50:30.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051592 1-146028888069 2-64424509454'
2026-03-08T22:50:30.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:50:30.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-124554051592
2026-03-08T22:50:30.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:30.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:50:30.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-124554051592
2026-03-08T22:50:30.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:30.644 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 124554051592
2026-03-08T22:50:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051592
2026-03-08T22:50:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 124554051592'
2026-03-08T22:50:30.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:50:30.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051591 -lt 124554051592
2026-03-08T22:50:30.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:50:31.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:50:31.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:50:31.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051592 -lt 124554051592
2026-03-08T22:50:31.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:50:31.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-146028888069
2026-03-08T22:50:31.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:31.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:50:31.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-146028888069
2026-03-08T22:50:31.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:31.958 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 146028888069
2026-03-08T22:50:31.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888069
2026-03-08T22:50:31.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 146028888069'
2026-03-08T22:50:31.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:50:32.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888069 -lt 146028888069
2026-03-08T22:50:32.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:50:32.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509454
2026-03-08T22:50:32.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:32.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:50:32.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509454
2026-03-08T22:50:32.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:32.111 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509454
2026-03-08T22:50:32.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509454
2026-03-08T22:50:32.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509454'
2026-03-08T22:50:32.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:50:32.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509454 -lt 64424509454
2026-03-08T22:50:32.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:50:32.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:50:32.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:50:32.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T22:50:32.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:50:32.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:50:32.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:50:32.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:50:32.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:50:32.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:50:32.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:50:32.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T22:50:32.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:50:32.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:50:32.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:50:32.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T22:50:32.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:50:32.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:50:32.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:579: TEST_auto_repair_bluestore_failed: flush_pg_stats
2026-03-08T22:50:32.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:50:32.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:50:32.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:50:32.945 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:50:32.945 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:50:32.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:50:32.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:50:32.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:50:33.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051593
2026-03-08T22:50:33.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051593
2026-03-08T22:50:33.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051593'
2026-03-08T22:50:33.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:50:33.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:50:33.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888071
2026-03-08T22:50:33.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888071
2026-03-08T22:50:33.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051593 1-146028888071'
2026-03-08T22:50:33.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:50:33.091 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:50:33.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509456
2026-03-08T22:50:33.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509456
2026-03-08T22:50:33.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051593 1-146028888071 2-64424509456'
2026-03-08T22:50:33.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:50:33.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-124554051593
2026-03-08T22:50:33.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:33.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:50:33.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-124554051593
2026-03-08T22:50:33.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051593
2026-03-08T22:50:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 124554051593'
2026-03-08T22:50:33.164 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 124554051593
2026-03-08T22:50:33.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:50:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051593 -lt 124554051593
2026-03-08T22:50:33.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:50:33.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-146028888071
2026-03-08T22:50:33.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:33.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:50:33.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-146028888071
2026-03-08T22:50:33.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:33.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888071
2026-03-08T22:50:33.326 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 146028888071
2026-03-08T22:50:33.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 146028888071'
2026-03-08T22:50:33.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:50:33.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888071 -lt 146028888071
2026-03-08T22:50:33.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:50:33.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509456
2026-03-08T22:50:33.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:50:33.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:50:33.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509456
2026-03-08T22:50:33.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:50:33.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509456
2026-03-08T22:50:33.487 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509456
2026-03-08T22:50:33.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509456'
2026-03-08T22:50:33.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:50:33.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509455 -lt 64424509456
2026-03-08T22:50:33.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:50:34.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:50:34.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:50:34.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509455 -lt 64424509456
2026-03-08T22:50:34.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:50:35.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T22:50:35.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:50:35.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509456 -lt 64424509456
2026-03-08T22:50:35.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:580: TEST_auto_repair_bluestore_failed: grep scrub_finish td/osd-scrub-repair/osd.1.log
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] planned AUTO_REPAIR TIME_FOR_DEEP] scrubber: scrub_finish before flags: AUTO_REPAIR. repair state: no-repair. deep_scrub_on_error: 0
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 1
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 15 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish: 5 errors. 2 errors fixed
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 20 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish Current 'required': 0 Planned 'req_scrub': 0
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 19 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] planned CHECK_REPAIR MUST_DEEP_SCRUB MUST_SCRUB] scrubber: scrub_finish before flags: CHECK_REPAIR REQ_SCRUB. repair state: no-repair. deep_scrub_on_error: 0
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 0
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 19 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] ] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent+failed_repair [ 1.0: REQ_SCRUB ] ] scrubber: scrub_finish 3 error(s) still present after re-scrub
2026-03-08T22:50:35.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:581: TEST_auto_repair_bluestore_failed: grep -q 'scrub_finish.*still present after re-scrub' td/osd-scrub-repair/osd.1.log
2026-03-08T22:50:35.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:582: TEST_auto_repair_bluestore_failed: ceph pg dump pgs
2026-03-08T22:50:36.096 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-08T22:50:36.096 INFO:tasks.workunit.client.0.vm03.stdout:1.0 10 0 0 0 0 63 0 0 10 0 10 active+clean+inconsistent+failed_repair 2026-03-08T22:50:24.129857+0000 20'10 35:103 [1,0] 1 [1,0] 1 20'10 2026-03-08T22:50:24.129820+0000 20'10 2026-03-08T22:50:24.129820+0000 0 1 periodic scrub scheduled @ 2026-03-09T22:50:24.129820+0000 10 0
2026-03-08T22:50:36.096 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-08T22:50:36.096 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-08T22:50:36.096 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T22:50:36.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:583: TEST_auto_repair_bluestore_failed: ceph pg dump pgs
2026-03-08T22:50:36.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:583: TEST_auto_repair_bluestore_failed: grep -q '^1.0.*+failed_repair'
2026-03-08T22:50:36.244 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T22:50:36.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:587: TEST_auto_repair_bluestore_failed: get_not_primary testpool obj1
2026-03-08T22:50:36.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool
2026-03-08T22:50:36.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=obj1
2026-03-08T22:50:36.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool obj1
2026-03-08T22:50:36.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T22:50:36.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1
2026-03-08T22:50:36.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool obj1
2026-03-08T22:50:36.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:50:36.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1
2026-03-08T22:50:36.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool obj1
2026-03-08T22:50:36.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]'
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:587: TEST_auto_repair_bluestore_failed: objectstore_tool td/osd-scrub-repair 0 obj1 list-attrs
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 obj1 list-attrs
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0
2026-03-08T22:50:36.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:50:36.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274:
_objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:50:36.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:50:36.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:50:36.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:50:36.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:50:36.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:50:36.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:50:36.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 obj1 list-attrs 2026-03-08T22:50:36.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:50:36.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:50:36.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:50:36.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 
2026-03-08T22:50:36.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:50:36.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj1 list-attrs 2026-03-08T22:50:37.001 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T22:50:37.001 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T22:50:37.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 
--osd-skip-data-digest=false ' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:37.287 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:50:37.287 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:50:37.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:50:37.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:50:37.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:50:37.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:50:37.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:50:37.288 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:50:37.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:50:37.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:50:37.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:50:37.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:50:37.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:50:37.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:50:37.304 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:37.304+0000 7f7bf19d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:37.305 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:37.304+0000 7f7bf19d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:37.307 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:37.304+0000 7f7bf19d88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:37.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:37.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:38.503 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:38.504+0000 7f7bf19d88c0 -1 Falling back to public interface 2026-03-08T22:50:38.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:50:38.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:38.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:38.609 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:38.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:38.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:38.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:39.505 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:39.504+0000 7f7bf19d88c0 -1 osd.0 35 log_to_monitors true 2026-03-08T22:50:39.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:39.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:39.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:50:39.769 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:50:39.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:39.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:39.936 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:40.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:40.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:40.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:50:40.938 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:50:40.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:40.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 39 up_thru 31 down_at 36 last_clean_interval [29,35) [v2:127.0.0.1:6802/4237213398,v1:127.0.0.1:6803/4237213398] [v2:127.0.0.1:6804/4237213398,v1:127.0.0.1:6805/4237213398] exists,up d26256ab-201b-4cf3-9723-3c20474ea0b8 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:50:41.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:50:41.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:50:41.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T22:50:41.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:50:41.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:50:41.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:50:41.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:50:41.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:50:41.308 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:41.308 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:50:41.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:50:41.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:41.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:50:41.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724546 2026-03-08T22:50:41.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724546 2026-03-08T22:50:41.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-167503724546' 2026-03-08T22:50:41.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:41.384 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:50:41.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888073 2026-03-08T22:50:41.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888073 2026-03-08T22:50:41.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724546 1-146028888073' 2026-03-08T22:50:41.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:41.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:50:41.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509459 2026-03-08T22:50:41.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509459 2026-03-08T22:50:41.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724546 1-146028888073 2-64424509459' 2026-03-08T22:50:41.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:50:41.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-167503724546 2026-03-08T22:50:41.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:41.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:50:41.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-167503724546 2026-03-08T22:50:41.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:41.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724546 2026-03-08T22:50:41.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 167503724546' 2026-03-08T22:50:41.534 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 167503724546 2026-03-08T22:50:41.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:41.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 167503724546 2026-03-08T22:50:41.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:50:42.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:50:42.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:42.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 167503724546 2026-03-08T22:50:42.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:50:43.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:50:43.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:44.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724546 -lt 167503724546 2026-03-08T22:50:44.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:44.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-146028888073 2026-03-08T22:50:44.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:44.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:50:44.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-146028888073 2026-03-08T22:50:44.025 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:44.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888073 2026-03-08T22:50:44.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 146028888073' 2026-03-08T22:50:44.025 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 146028888073 2026-03-08T22:50:44.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:50:44.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888074 -lt 146028888073 2026-03-08T22:50:44.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:44.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509459 2026-03-08T22:50:44.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:44.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:50:44.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509459 2026-03-08T22:50:44.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:50:44.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509459 2026-03-08T22:50:44.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509459' 2026-03-08T22:50:44.190 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509459 2026-03-08T22:50:44.190 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:50:44.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509459 -lt 64424509459 2026-03-08T22:50:44.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:50:44.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:44.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:44.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:50:44.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:50:44.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:50:44.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: 
local expression 2026-03-08T22:50:44.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:50:44.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:50:44.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:50:44.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:50:44.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:50:44.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:50:44.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:44.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:44.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:50:44.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:50:44.903 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:50:44.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:588: TEST_auto_repair_bluestore_failed: rados --pool testpool get obj1 td/osd-scrub-repair/COPY 2026-03-08T22:50:44.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:589: TEST_auto_repair_bluestore_failed: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:50:44.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:590: TEST_auto_repair_bluestore_failed: grep scrub_finish td/osd-scrub-repair/osd.1.log 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] planned AUTO_REPAIR TIME_FOR_DEEP] scrubber: scrub_finish before flags: AUTO_REPAIR. repair state: no-repair. 
deep_scrub_on_error: 0 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 1 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 15 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish: 5 errors. 2 errors fixed 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 20 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish Current 'required': 0 Planned 'req_scrub': 0 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 19 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] planned CHECK_REPAIR 
MUST_DEEP_SCRUB MUST_SCRUB] scrubber: scrub_finish before flags: CHECK_REPAIR REQ_SCRUB. repair state: no-repair. deep_scrub_on_error: 0 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 0 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 19 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] ] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent+failed_repair [ 1.0: REQ_SCRUB ] ] scrubber: scrub_finish 3 error(s) still present after re-scrub 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:593: TEST_auto_repair_bluestore_failed: get_primary testpool SOMETHING 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:50:44.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:50:44.924 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:50:44.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:593: TEST_auto_repair_bluestore_failed: objectstore_tool td/osd-scrub-repair 1 obj2 remove 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 obj2 remove 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: 
local id=1 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:50:45.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:50:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:50:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 obj2 remove 2026-03-08T22:50:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:50:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:50:45.190 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:50:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:50:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:50:45.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj2 remove 2026-03-08T22:50:45.835 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:104778fc:::obj2:head#, (61) No data available 2026-03-08T22:50:45.835 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:104778fc:::obj2:head# 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:50:46.367 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:50:46.367 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:50:46.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:50:46.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:50:46.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:50:46.369 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T22:50:46.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 
--osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:50:46.369 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:50:46.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:50:46.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:50:46.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:50:46.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:50:46.386 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:46.384+0000 7f17c13f78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:46.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:46.384+0000 7f17c13f78c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:50:46.388 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:46.388+0000 7f17c13f78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:46.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:46.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:47.095 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:47.096+0000 7f17c13f78c0 -1 Falling back to 
public interface 2026-03-08T22:50:47.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:47.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:47.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:50:47.704 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:47.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:47.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:47.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:48.095 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:50:48.096+0000 7f17c13f78c0 -1 osd.1 40 log_to_monitors true 2026-03-08T22:50:48.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:48.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:48.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:50:48.881 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:50:48.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:48.881 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:49.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:50:50.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:50:50.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:50:50.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:50:50.055 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:50:50.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:50:50.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 44 up_thru 44 down_at 41 last_clean_interval [34,40) [v2:127.0.0.1:6810/2724057046,v1:127.0.0.1:6811/2724057046] [v2:127.0.0.1:6812/2724057046,v1:127.0.0.1:6813/2724057046] exists,up ac846ce5-c18d-4200-a9cb-f5bc986abf24 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:50:50.214 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:50:50.214 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:50:50.215 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:50:50.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:50:50.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:50:50.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:50:50.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:50:50.279 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:50:50.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:50:50.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:50:50.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:50:50.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:50:50.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:50:50.443 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:50:50.443 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:50:50.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:50:50.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:50.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:50:50.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724549 2026-03-08T22:50:50.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724549 
2026-03-08T22:50:50.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724549' 2026-03-08T22:50:50.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:50.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:50:50.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561026 2026-03-08T22:50:50.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561026 2026-03-08T22:50:50.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724549 1-188978561026' 2026-03-08T22:50:50.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:50:50.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:50:50.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509461 2026-03-08T22:50:50.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509461 2026-03-08T22:50:50.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724549 1-188978561026 2-64424509461' 
2026-03-08T22:50:50.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:50.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-167503724549 2026-03-08T22:50:50.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:50.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:50:50.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-167503724549 2026-03-08T22:50:50.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:50.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724549 2026-03-08T22:50:50.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 167503724549' 2026-03-08T22:50:50.682 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 167503724549 2026-03-08T22:50:50.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:50.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724547 -lt 167503724549 2026-03-08T22:50:50.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T22:50:51.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:50:51.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:50:52.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724549 -lt 167503724549 2026-03-08T22:50:52.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:52.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-188978561026 2026-03-08T22:50:52.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:52.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:50:52.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-188978561026 2026-03-08T22:50:52.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:52.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561026 2026-03-08T22:50:52.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 188978561026' 2026-03-08T22:50:52.002 
INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 188978561026 2026-03-08T22:50:52.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:50:52.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561026 -lt 188978561026 2026-03-08T22:50:52.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:50:52.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509461 2026-03-08T22:50:52.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:50:52.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:50:52.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509461 2026-03-08T22:50:52.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:50:52.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509461 2026-03-08T22:50:52.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509461' 2026-03-08T22:50:52.163 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509461 2026-03-08T22:50:52.163 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:50:52.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509461 -lt 64424509461 2026-03-08T22:50:52.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:50:52.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:52.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:52.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:50:52.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:50:52.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:50:52.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:50:52.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:50:52.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:50:52.525 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:50:52.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:50:52.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:50:52.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:50:52.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:50:52.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:50:52.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:594: TEST_auto_repair_bluestore_failed: repair 1.0 2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.0 
2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 1.0 2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:52.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:53.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:50:24.129820+0000 2026-03-08T22:50:53.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.0 2026-03-08T22:50:53.167 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to repair 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.0 2026-03-08T22:50:24.129820+0000 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:50:24.129820+0000 
2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:53.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:53.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:50:24.129820+0000 '>' 2026-03-08T22:50:24.129820+0000 2026-03-08T22:50:53.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:54.337 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:54.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:54.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:54.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:54.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:54.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:54.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:54.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:50:24.129820+0000 '>' 2026-03-08T22:50:24.129820+0000 2026-03-08T22:50:54.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:55.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:55.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: 
(( i < 300 )) 2026-03-08T22:50:55.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:55.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:55.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:55.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:55.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:55.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:50:24.129820+0000 '>' 2026-03-08T22:50:24.129820+0000 2026-03-08T22:50:55.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:56.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:56.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:56.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:56.655 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:56.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:56.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:56.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:56.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:50:24.129820+0000 '>' 2026-03-08T22:50:24.129820+0000 2026-03-08T22:50:56.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:57.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:57.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:57.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:57.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:57.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:57.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:57.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:57.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:50:24.129820+0000 '>' 2026-03-08T22:50:24.129820+0000 2026-03-08T22:50:57.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:50:58.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:50:58.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:50:58.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:50:58.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:50:58.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:50:58.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:50:58.981 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:50:59.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:50:54.067365+0000 '>' 2026-03-08T22:50:24.129820+0000 2026-03-08T22:50:59.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:50:59.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:595: TEST_auto_repair_bluestore_failed: sleep 2 2026-03-08T22:51:01.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:597: TEST_auto_repair_bluestore_failed: flush_pg_stats 2026-03-08T22:51:01.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:51:01.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:51:01.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:51:01.306 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:01.306 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:51:01.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:51:01.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:01.306 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:51:01.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724552 2026-03-08T22:51:01.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724552 2026-03-08T22:51:01.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724552' 2026-03-08T22:51:01.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:01.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:51:01.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561029 2026-03-08T22:51:01.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561029 2026-03-08T22:51:01.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724552 1-188978561029' 2026-03-08T22:51:01.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:01.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:51:01.516 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509465 2026-03-08T22:51:01.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509465 2026-03-08T22:51:01.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724552 1-188978561029 2-64424509465' 2026-03-08T22:51:01.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:01.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-167503724552 2026-03-08T22:51:01.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:01.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:51:01.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-167503724552 2026-03-08T22:51:01.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:01.518 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 167503724552 2026-03-08T22:51:01.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724552 2026-03-08T22:51:01.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 
167503724552' 2026-03-08T22:51:01.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:01.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724551 -lt 167503724552 2026-03-08T22:51:01.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:51:02.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:51:02.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:02.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724551 -lt 167503724552 2026-03-08T22:51:02.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:51:03.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:51:03.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:03.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724552 -lt 167503724552 2026-03-08T22:51:03.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:03.981 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-188978561029 2026-03-08T22:51:03.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:03.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:51:03.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-188978561029 2026-03-08T22:51:03.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:03.984 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 188978561029 2026-03-08T22:51:03.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561029 2026-03-08T22:51:03.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 188978561029' 2026-03-08T22:51:03.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:51:04.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561029 -lt 188978561029 2026-03-08T22:51:04.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:04.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509465 
2026-03-08T22:51:04.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:04.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:51:04.139 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509465 2026-03-08T22:51:04.139 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:04.140 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509465 2026-03-08T22:51:04.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509465 2026-03-08T22:51:04.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509465' 2026-03-08T22:51:04.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:51:04.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509465 -lt 64424509465 2026-03-08T22:51:04.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:598: TEST_auto_repair_bluestore_failed: ceph pg dump pgs 2026-03-08T22:51:04.429 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP 
SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:51:04.430 INFO:tasks.workunit.client.0.vm03.stdout:1.0 9 0 0 0 0 63 0 0 10 0 10 active+clean 2026-03-08T22:50:54.067408+0000 20'10 45:141 [1,0] 1 [1,0] 1 20'10 2026-03-08T22:50:54.067365+0000 20'10 2026-03-08T22:50:54.067365+0000 0 0 periodic scrub scheduled @ 2026-03-09T22:50:54.067365+0000 9 0 2026-03-08T22:51:04.430 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:51:04.430 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:51:04.430 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:51:04.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:599: TEST_auto_repair_bluestore_failed: ceph pg dump pgs 2026-03-08T22:51:04.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:599: TEST_auto_repair_bluestore_failed: grep -q -e '^1.0.* active+clean ' -e '^1.0.* active+clean+wait ' 2026-03-08T22:51:04.578 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:51:04.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:600: TEST_auto_repair_bluestore_failed: grep scrub_finish td/osd-scrub-repair/osd.1.log 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] planned AUTO_REPAIR TIME_FOR_DEEP] scrubber: scrub_finish before flags: AUTO_REPAIR. repair state: no-repair. 
deep_scrub_on_error: 0 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 1 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 15 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish: 5 errors. 2 errors fixed 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 20 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish Current 'required': 0 Planned 'req_scrub': 0 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.124+0000 7f4efd84d640 19 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] planned CHECK_REPAIR 
MUST_DEEP_SCRUB MUST_SCRUB] scrubber: scrub_finish before flags: CHECK_REPAIR REQ_SCRUB. repair state: no-repair. deep_scrub_on_error: 0 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 0 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 19 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: CHECK_REPAIR REQ_SCRUB ] ] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:24.128+0000 7f4efe84f640 10 osd.1 pg_epoch: 35 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=34/35 n=10 ec=16/16 lis/c=34/34 les/c/f=35/35/0 sis=34) [1,0] r=0 lpr=34 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent+failed_repair [ 1.0: REQ_SCRUB ] ] scrubber: scrub_finish 3 error(s) still present after re-scrub 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:54.064+0000 7f17a4b80640 10 osd.1 pg_epoch: 45 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=44/45 n=10 ec=16/16 lis/c=44/44 les/c/f=45/45/0 sis=44) [1,0] r=0 lpr=44 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+repair [ 1.0: AUTO_REPAIR REQ_SCRUB ] MUST_REPAIR planned AUTO_REPAIR MUST_DEEP_SCRUB MUST_SCRUB planned REQ_SCRUB] scrubber: scrub_finish before flags: AUTO_REPAIR REQ_SCRUB. repair state: repair. 
deep_scrub_on_error: 0 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:54.064+0000 7f17a4b80640 10 osd.1 pg_epoch: 45 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=44/45 n=10 ec=16/16 lis/c=44/44 les/c/f=45/45/0 sis=44) [1,0] r=0 lpr=44 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+repair [ 1.0: AUTO_REPAIR REQ_SCRUB ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 1 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:54.064+0000 7f17a4b80640 15 osd.1 pg_epoch: 45 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=44/45 n=9 ec=16/16 lis/c=44/44 les/c/f=45/45/0 sis=44) [1,0] r=0 lpr=44 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+repair [ 1.0: AUTO_REPAIR REQ_SCRUB ] ] scrubber: scrub_finish: 1 errors. 1 errors fixed 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:54.064+0000 7f17a4b80640 20 osd.1 pg_epoch: 45 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=44/45 n=9 ec=16/16 lis/c=44/44 les/c/f=45/45/0 sis=44) [1,0] r=0 lpr=44 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+repair [ 1.0: AUTO_REPAIR REQ_SCRUB ] ] scrubber: scrub_finish All may be fixed 2026-03-08T22:51:04.590 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:50:54.064+0000 7f17a4b80640 19 osd.1 pg_epoch: 45 pg[1.0( v 20'10 (0'0,20'10] local-lis/les=44/45 n=9 ec=16/16 lis/c=44/44 les/c/f=45/45/0 sis=44) [1,0] r=0 lpr=44 crt=20'10 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+repair [ 1.0: AUTO_REPAIR REQ_SCRUB ] ] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0 2026-03-08T22:51:04.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T22:51:04.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:51:04.591 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:51:04.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:51:04.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:51:04.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:51:04.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:51:04.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:51:04.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:51:04.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:51:04.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:51:04.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:51:04.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:51:04.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:51:04.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:51:04.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:51:04.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:51:04.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:51:04.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:51:04.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:51:04.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:51:04.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:51:04.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:51:04.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:51:04.722 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:04.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:04.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:51:04.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:51:04.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:51:04.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:51:04.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:51:04.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:51:04.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:51:04.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:51:04.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:51:04.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:51:04.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:51:04.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:51:04.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:51:04.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:51:04.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:51:04.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:51:04.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:51:04.729 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:04.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:04.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:51:04.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:51:04.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:51:04.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T22:51:04.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:51:04.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:04.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:04.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T22:51:04.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_auto_repair_bluestore_failed_norecov td/osd-scrub-repair 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:604: TEST_auto_repair_bluestore_failed_norecov: local dir=td/osd-scrub-repair 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:605: TEST_auto_repair_bluestore_failed_norecov: local poolname=testpool 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:608: TEST_auto_repair_bluestore_failed_norecov: run_mon td/osd-scrub-repair a 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: 
run_mon: local data=td/osd-scrub-repair/a 2026-03-08T22:51:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T22:51:04.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:51:04.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:04.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:04.755 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:04.755 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:04.755 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:04.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:04.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 
--mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:51:04.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:51:04.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:51:04.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:51:04.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:51:04.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:51:04.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:51:04.782 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:51:04.782 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:51:04.782 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:51:04.790 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:51:04.790 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:04.790 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:04.790 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:51:04.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:51:04.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T22:51:04.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:51:04.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:51:04.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:51:04.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:51:04.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:51:04.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:51:04.848 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:51:04.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:51:04.848 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:51:04.848 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:51:04.848 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:51:04.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T22:51:04.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:51:04.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T22:51:04.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:609: TEST_auto_repair_bluestore_failed_norecov: run_mgr td/osd-scrub-repair x
2026-03-08T22:51:04.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T22:51:04.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:51:04.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:51:04.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:51:04.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T22:51:04.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:51:05.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:51:05.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:51:05.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:51:05.000 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:51:05.000 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:51:05.000 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:51:05.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:51:05.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:51:05.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:51:05.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:612: TEST_auto_repair_bluestore_failed_norecov: local 'ceph_osd_args=--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:51:05.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:613: TEST_auto_repair_bluestore_failed_norecov: seq 0 2
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:613: TEST_auto_repair_bluestore_failed_norecov: for id in $(seq 0 2)
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:614: TEST_auto_repair_bluestore_failed_norecov: run_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:51:05.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T22:51:05.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:51:05.029 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 11d5b76a-49b0-44c5-bc6b-7ed443d1cea4
2026-03-08T22:51:05.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=11d5b76a-49b0-44c5-bc6b-7ed443d1cea4
2026-03-08T22:51:05.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 11d5b76a-49b0-44c5-bc6b-7ed443d1cea4'
2026-03-08T22:51:05.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:51:05.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBZ/a1pI9ZzAhAAkusD/78+agFb6eBPlYzM1w==
2026-03-08T22:51:05.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBZ/a1pI9ZzAhAAkusD/78+agFb6eBPlYzM1w=="}'
2026-03-08T22:51:05.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 11d5b76a-49b0-44c5-bc6b-7ed443d1cea4 -i td/osd-scrub-repair/0/new.json
2026-03-08T22:51:05.132 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:51:05.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T22:51:05.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQBZ/a1pI9ZzAhAAkusD/78+agFb6eBPlYzM1w== --osd-uuid 11d5b76a-49b0-44c5-bc6b-7ed443d1cea4
2026-03-08T22:51:05.158 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:05.156+0000 7f01bc2ab8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:05.163 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:05.164+0000 7f01bc2ab8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:05.164 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:05.164+0000 7f01bc2ab8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:05.164 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:05.164+0000 7f01bc2ab8c0 -1 bdev(0x5625d6331c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:51:05.164 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:05.164+0000 7f01bc2ab8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T22:51:07.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T22:51:07.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:51:07.428 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T22:51:07.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:51:07.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:51:07.616 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T22:51:07.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:51:07.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:51:07.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:51:07.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:51:07.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:51:07.631 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:07.632+0000 7fefc10bb8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:07.638 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:07.640+0000 7fefc10bb8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:07.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:07.640+0000 7fefc10bb8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:51:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:51:07.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:51:08.835 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:08.832+0000 7fefc10bb8c0 -1 Falling back to public interface
2026-03-08T22:51:08.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:51:08.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:51:08.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:51:08.936 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:51:08.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:51:08.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:51:09.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:51:09.799 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:09.796+0000 7fefc10bb8c0 -1 osd.0 0 log_to_monitors true
2026-03-08T22:51:10.101 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:51:10.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:51:10.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:51:10.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:51:10.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:51:10.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:51:10.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:51:11.274 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:51:11.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:51:11.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:51:11.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:51:11.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:51:11.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:51:11.430 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/4120313827,v1:127.0.0.1:6803/4120313827] [v2:127.0.0.1:6804/4120313827,v1:127.0.0.1:6805/4120313827] exists,up 11d5b76a-49b0-44c5-bc6b-7ed443d1cea4
2026-03-08T22:51:11.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:51:11.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:51:11.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:613: TEST_auto_repair_bluestore_failed_norecov: for id in $(seq 0 2)
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:614: TEST_auto_repair_bluestore_failed_norecov: run_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:51:11.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:51:11.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T22:51:11.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:51:11.433 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 bb3bfe22-f830-4dbb-9f8e-43e0d214a8d1
2026-03-08T22:51:11.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=bb3bfe22-f830-4dbb-9f8e-43e0d214a8d1
2026-03-08T22:51:11.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 bb3bfe22-f830-4dbb-9f8e-43e0d214a8d1'
2026-03-08T22:51:11.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:51:11.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBf/a1pqSOMGhAA1AdqeMxLdwkWJj/NekObuw==
2026-03-08T22:51:11.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBf/a1pqSOMGhAA1AdqeMxLdwkWJj/NekObuw=="}'
2026-03-08T22:51:11.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new bb3bfe22-f830-4dbb-9f8e-43e0d214a8d1 -i td/osd-scrub-repair/1/new.json
2026-03-08T22:51:11.601 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:51:11.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json
2026-03-08T22:51:11.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQBf/a1pqSOMGhAA1AdqeMxLdwkWJj/NekObuw== --osd-uuid bb3bfe22-f830-4dbb-9f8e-43e0d214a8d1
2026-03-08T22:51:11.630 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:11.624+0000 7f72c0acd8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:11.631 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:11.628+0000 7f72c0acd8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:11.632 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:11.628+0000 7f72c0acd8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:11.633 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:11.628+0000 7f72c0acd8c0 -1 bdev(0x559207e11c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:51:11.633 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:11.628+0000 7f72c0acd8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid
2026-03-08T22:51:13.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring
2026-03-08T22:51:13.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:51:13.900 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T22:51:13.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:51:13.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:51:14.106 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T22:51:14.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:51:14.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:51:14.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:51:14.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:51:14.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:51:14.123 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:14.116+0000 7f821b5628c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:14.123 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:14.116+0000 7f821b5628c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:14.124 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:14.120+0000 7f821b5628c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:51:14.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:51:14.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:51:15.323 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:15.320+0000 7f821b5628c0 -1 Falling back to public interface
2026-03-08T22:51:15.457 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:51:15.457
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:15.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:15.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:15.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:15.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:15.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:16.289 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:16.284+0000 7f821b5628c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:51:16.623 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:51:16.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:16.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:16.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:16.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:16.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:16.799 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:17.380 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:17.376+0000 7f8216d1b640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T22:51:17.800 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:51:17.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:17.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:17.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:17.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:17.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/583189646,v1:127.0.0.1:6811/583189646] [v2:127.0.0.1:6812/583189646,v1:127.0.0.1:6813/583189646] exists,up bb3bfe22-f830-4dbb-9f8e-43e0d214a8d1 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:17.979 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:613: TEST_auto_repair_bluestore_failed_norecov: for id in $(seq 0 2) 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:614: TEST_auto_repair_bluestore_failed_norecov: run_osd td/osd-scrub-repair 2 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 
2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:51:17.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:17.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:51:17.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:51:17.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:51:17.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:17.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:17.980 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:17.980 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:17.980 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:17.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:17.981 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:51:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:51:17.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:51:17.982 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 4d501598-7c3f-43ec-a230-a09752bf28e7 2026-03-08T22:51:17.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4d501598-7c3f-43ec-a230-a09752bf28e7 2026-03-08T22:51:17.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 4d501598-7c3f-43ec-a230-a09752bf28e7' 2026-03-08T22:51:17.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:51:17.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: 
OSD_SECRET=AQBl/a1pjRtdOxAA32AXN6l6ADhS1XzHPvC55Q== 2026-03-08T22:51:17.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBl/a1pjRtdOxAA32AXN6l6ADhS1XzHPvC55Q=="}' 2026-03-08T22:51:17.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4d501598-7c3f-43ec-a230-a09752bf28e7 -i td/osd-scrub-repair/2/new.json 2026-03-08T22:51:18.171 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:51:18.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T22:51:18.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQBl/a1pjRtdOxAA32AXN6l6ADhS1XzHPvC55Q== --osd-uuid 4d501598-7c3f-43ec-a230-a09752bf28e7 2026-03-08T22:51:18.203 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:18.200+0000 7f5b6f2fe8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:18.205 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:18.200+0000 7f5b6f2fe8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:18.206 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:18.200+0000 7f5b6f2fe8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:18.206 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:18.200+0000 7f5b6f2fe8c0 -1 bdev(0x55f642db1c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:51:18.206 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:18.204+0000 7f5b6f2fe8c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T22:51:20.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T22:51:20.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:51:20.726 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T22:51:20.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:51:20.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:51:20.953 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T22:51:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:51:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:51:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:51:20.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:51:20.971 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:20.964+0000 7f776c0ab8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:20.975 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:20.972+0000 7f776c0ab8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:20.977 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:20.972+0000 7f776c0ab8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:21.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:51:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:22.171 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:22.168+0000 7f776c0ab8c0 -1 Falling back to public interface 2026-03-08T22:51:22.325 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:51:22.325 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:22.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:22.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:22.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:22.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:51:22.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:23.170 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:23.164+0000 7f776c0ab8c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:51:23.499 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:51:23.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:23.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:23.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:23.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:23.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:51:23.680 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:24.682 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:51:24.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:24.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:24.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:24.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:24.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:51:24.854 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1012611596,v1:127.0.0.1:6819/1012611596] [v2:127.0.0.1:6820/1012611596,v1:127.0.0.1:6821/1012611596] exists,up 4d501598-7c3f-43ec-a230-a09752bf28e7 2026-03-08T22:51:24.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:24.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:24.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:24.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:617: TEST_auto_repair_bluestore_failed_norecov: 
create_pool testpool 1 1 2026-03-08T22:51:24.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create testpool 1 1 2026-03-08T22:51:25.070 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool' created 2026-03-08T22:51:25.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:51:26.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:618: TEST_auto_repair_bluestore_failed_norecov: ceph osd pool set testpool size 2 2026-03-08T22:51:26.276 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 size to 2 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:619: TEST_auto_repair_bluestore_failed_norecov: wait_for_clean 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: echo true 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:51:26.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:51:26.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:51:26.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:51:26.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:51:26.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:51:26.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:51:26.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:51:26.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:51:26.525 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:26.525 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:51:26.525 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:51:26.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:26.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:51:26.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T22:51:26.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T22:51:26.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T22:51:26.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:26.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:51:26.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672963 2026-03-08T22:51:26.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672963 2026-03-08T22:51:26.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963' 2026-03-08T22:51:26.678 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:26.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:51:26.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509442 2026-03-08T22:51:26.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509442 2026-03-08T22:51:26.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963 2-64424509442' 2026-03-08T22:51:26.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:26.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T22:51:26.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:26.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:51:26.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T22:51:26.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:26.767 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 
2026-03-08T22:51:26.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T22:51:26.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T22:51:26.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:26.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836485 2026-03-08T22:51:26.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:51:27.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:51:27.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:28.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485 2026-03-08T22:51:28.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:28.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672963 2026-03-08T22:51:28.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:28.098 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:51:28.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672963 2026-03-08T22:51:28.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:28.099 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963 2026-03-08T22:51:28.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672963 2026-03-08T22:51:28.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672963' 2026-03-08T22:51:28.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:51:28.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672963 -lt 42949672963 2026-03-08T22:51:28.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:28.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509442 2026-03-08T22:51:28.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:28.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:51:28.264 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509442 2026-03-08T22:51:28.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:28.265 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442 2026-03-08T22:51:28.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509442 2026-03-08T22:51:28.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509442' 2026-03-08T22:51:28.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:51:28.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509442 2026-03-08T22:51:28.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:51:28.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:28.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:28.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:51:28.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 
2026-03-08T22:51:28.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:28.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:28.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:28.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:28.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:28.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:28.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:51:28.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:28.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:28.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:29.004 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:51:29.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:51:29.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:51:29.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:622: TEST_auto_repair_bluestore_failed_norecov: local payload=ABCDEF 2026-03-08T22:51:29.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:623: TEST_auto_repair_bluestore_failed_norecov: echo ABCDEF 2026-03-08T22:51:29.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: seq 1 10 2026-03-08T22:51:29.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj1 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj2 td/osd-scrub-repair/ORIGINAL 
2026-03-08T22:51:29.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj3 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj4 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj5 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj6 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: 
TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj7 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj8 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj9 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:624: TEST_auto_repair_bluestore_failed_norecov: for i in $(seq 1 10) 2026-03-08T22:51:29.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:626: TEST_auto_repair_bluestore_failed_norecov: rados --pool testpool put obj10 td/osd-scrub-repair/ORIGINAL 2026-03-08T22:51:29.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:632: TEST_auto_repair_bluestore_failed_norecov: get_not_primary testpool SOMETHING 2026-03-08T22:51:29.228 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool 2026-03-08T22:51:29.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:51:29.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:51:29.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:51:29.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:51:29.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:51:29.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:51:29.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:51:29.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:51:29.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. 
!= 1)) | .[0]' 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:632: TEST_auto_repair_bluestore_failed_norecov: objectstore_tool td/osd-scrub-repair 0 obj1 remove 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 obj1 remove 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: 
_objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:51:29.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:51:29.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:51:29.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 obj1 remove 2026-03-08T22:51:29.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:51:29.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:51:29.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:51:29.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 
2026-03-08T22:51:29.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:51:29.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj1 remove 2026-03-08T22:51:30.380 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:ff7b1f36:::obj1:head# 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:51:30.914 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 
2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:30.914 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:30.915 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:51:30.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:51:30.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:51:30.916 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:51:30.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:30.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:51:30.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:51:30.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:51:30.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:51:30.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:51:30.935 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:30.928+0000 7f68b59078c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:30.935 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:30.932+0000 7f68b59078c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:30.937 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:30.932+0000 7f68b59078c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:31.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:31.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:32.135 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:32.132+0000 7f68b59078c0 -1 Falling back to public interface 2026-03-08T22:51:32.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:51:32.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:32.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:32.280 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:32.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:32.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:32.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:33.111 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:33.108+0000 7f68b59078c0 -1 osd.0 21 log_to_monitors true 2026-03-08T22:51:33.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:33.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:33.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:33.452 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:51:33.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:33.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:33.623 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:34.625 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:51:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:34.792 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 25 up_thru 0 down_at 22 last_clean_interval [5,21) [v2:127.0.0.1:6802/51887793,v1:127.0.0.1:6803/51887793] [v2:127.0.0.1:6804/51887793,v1:127.0.0.1:6805/51887793] exists,up 11d5b76a-49b0-44c5-bc6b-7ed443d1cea4 2026-03-08T22:51:34.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:34.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:34.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:34.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:51:34.792 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:51:34.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:51:34.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:51:34.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:51:34.793 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:51:34.793 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:51:34.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:51:34.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:51:34.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:51:34.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:51:34.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:51:34.867 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:51:34.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:51:34.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:51:34.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:51:35.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:51:35.058 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:35.058 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:51:35.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:51:35.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:35.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:51:35.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182402 2026-03-08T22:51:35.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182402 2026-03-08T22:51:35.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182402' 
2026-03-08T22:51:35.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:35.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:51:35.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672966 2026-03-08T22:51:35.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672966 2026-03-08T22:51:35.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182402 1-42949672966' 2026-03-08T22:51:35.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:35.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:51:35.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509444 2026-03-08T22:51:35.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509444 2026-03-08T22:51:35.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182402 1-42949672966 2-64424509444' 2026-03-08T22:51:35.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:35.304 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-107374182402 2026-03-08T22:51:35.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:35.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:51:35.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-107374182402 2026-03-08T22:51:35.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:35.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182402 2026-03-08T22:51:35.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 107374182402' 2026-03-08T22:51:35.306 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 107374182402 2026-03-08T22:51:35.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:35.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182402 -lt 107374182402 2026-03-08T22:51:35.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:35.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672966 
2026-03-08T22:51:35.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:35.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:51:35.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672966 2026-03-08T22:51:35.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:35.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672966 2026-03-08T22:51:35.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672966' 2026-03-08T22:51:35.480 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672966 2026-03-08T22:51:35.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:51:35.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672966 -lt 42949672966 2026-03-08T22:51:35.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:35.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509444 2026-03-08T22:51:35.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d 
- -f 1 2026-03-08T22:51:35.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:51:35.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509444 2026-03-08T22:51:35.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:35.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509444 2026-03-08T22:51:35.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509444' 2026-03-08T22:51:35.649 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509444 2026-03-08T22:51:35.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:51:35.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509444 -lt 64424509444 2026-03-08T22:51:35.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:51:35.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:35.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:36.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
wait_for_clean: test 1 == 0 2026-03-08T22:51:36.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:36.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:36.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:36.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:36.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:36.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:36.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:36.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:51:36.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:36.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:36.215 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:36.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:51:36.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:51:36.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:51:36.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:633: TEST_auto_repair_bluestore_failed_norecov: get_primary testpool SOMETHING 2026-03-08T22:51:36.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:51:36.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:51:36.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:51:36.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:51:36.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:633: TEST_auto_repair_bluestore_failed_norecov: objectstore_tool td/osd-scrub-repair 1 obj1 rm-attr _ 2026-03-08T22:51:36.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local 
dir=td/osd-scrub-repair 2026-03-08T22:51:36.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 obj1 rm-attr _ 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 
2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:51:36.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:51:36.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:51:36.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 obj1 rm-attr _ 2026-03-08T22:51:36.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:51:36.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:51:36.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:51:36.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:51:36.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:51:36.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path 
td/osd-scrub-repair/1 obj1 rm-attr _ 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:38.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: 
activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.43024 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: 
activate_osd: ceph_args+=' ' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:51:38.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:51:38.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:51:38.373 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T22:51:38.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:38.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:51:38.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 
2026-03-08T22:51:38.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:51:38.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:51:38.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:51:38.394 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:38.388+0000 7f4f52b4b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:38.394 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:38.388+0000 7f4f52b4b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:38.395 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:38.392+0000 7f4f52b4b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:38.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:51:38.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:38.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:51:38.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:38.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:38.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:38.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:38.562 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:51:38.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:38.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:38.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:39.353 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:39.348+0000 7f4f52b4b8c0 -1 Falling back to public interface 2026-03-08T22:51:39.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:51:39.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:39.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:39.740 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:39.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:39.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:39.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:40.323 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:40.320+0000 7f4f52b4b8c0 -1 osd.1 26 log_to_monitors true 2026-03-08T22:51:40.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:40.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:40.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:40.912 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:51:40.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:40.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:41.106 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:41.695 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:41.692+0000 7f4f49afb640 -1 osd.1 26 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:51:42.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:42.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:42.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:42.108 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:51:42.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:42.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 30 up_thru 30 down_at 27 last_clean_interval [10,26) [v2:127.0.0.1:6810/289042923,v1:127.0.0.1:6811/289042923] [v2:127.0.0.1:6812/289042923,v1:127.0.0.1:6813/289042923] exists,up bb3bfe22-f830-4dbb-9f8e-43e0d214a8d1 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: 
wait_for_osd: return 0 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:51:42.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:51:42.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' 
'4.5') 2026-03-08T22:51:42.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:51:42.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:51:42.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:51:42.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:51:42.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:51:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:51:42.532 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:42.532 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:51:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:51:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:42.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:51:42.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182404 2026-03-08T22:51:42.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
107374182404 2026-03-08T22:51:42.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182404' 2026-03-08T22:51:42.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:42.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:51:42.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018882 2026-03-08T22:51:42.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018882 2026-03-08T22:51:42.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182404 1-128849018882' 2026-03-08T22:51:42.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:42.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:51:42.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509447 2026-03-08T22:51:42.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509447 2026-03-08T22:51:42.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182404 1-128849018882 2-64424509447' 
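The delays array printed above — 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5 — is the output of `get_timeout_delays 90 .1`: sleep intervals double from the base step, are capped (at 15 here), and a final partial interval tops the series up to the 90-second budget. A sketch that reproduces that sequence; the cap and pad-to-total behavior are inferred from the printed array, not read from the helper's source, and `get_delays` is a hypothetical name:

```shell
# Sketch of get_timeout_delays: emit sleep intervals that double from
# $first, cap at $cap, and end with a partial interval so the series sums
# to $timeout. (Cap/pad behavior inferred from the array in this log.)
get_delays() {
    local timeout=$1 first=$2 cap=${3:-15}
    awk -v t="$timeout" -v d="$first" -v cap="$cap" 'BEGIN {
        total = 0
        while (total + d < t) {
            printf "%g ", d; total += d
            d *= 2; if (d > cap) d = cap
        }
        if (t > total) printf "%g", t - total   # final top-up interval
        print ""
    }'
}
```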
2026-03-08T22:51:42.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:42.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-107374182404 2026-03-08T22:51:42.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:42.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:51:42.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-107374182404 2026-03-08T22:51:42.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:42.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182404 2026-03-08T22:51:42.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 107374182404' 2026-03-08T22:51:42.766 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 107374182404 2026-03-08T22:51:42.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:42.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182403 -lt 107374182404 2026-03-08T22:51:42.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T22:51:43.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:51:43.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:44.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182405 -lt 107374182404 2026-03-08T22:51:44.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:44.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-128849018882 2026-03-08T22:51:44.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:44.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:51:44.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-128849018882 2026-03-08T22:51:44.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:44.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018882 2026-03-08T22:51:44.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 128849018882' 2026-03-08T22:51:44.100 
INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 128849018882 2026-03-08T22:51:44.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:51:44.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018882 -lt 128849018882 2026-03-08T22:51:44.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:44.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509447 2026-03-08T22:51:44.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:44.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:51:44.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509447 2026-03-08T22:51:44.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:44.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509447 2026-03-08T22:51:44.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509447' 2026-03-08T22:51:44.267 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509447 2026-03-08T22:51:44.267 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:51:44.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509447 -lt 64424509447 2026-03-08T22:51:44.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:51:44.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:44.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:44.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:51:44.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:44.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:44.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:44.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:44.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:44.632 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:44.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:44.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:51:44.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:44.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:44.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:45.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:51:45.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:51:45.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:51:45.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:635: TEST_auto_repair_bluestore_failed_norecov: get_not_primary testpool SOMETHING 2026-03-08T22:51:45.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: 
get_not_primary: local poolname=testpool 2026-03-08T22:51:45.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:51:45.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:51:45.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:51:45.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:51:45.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:51:45.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:51:45.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:51:45.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:51:45.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. 
!= 1)) | .[0]' 2026-03-08T22:51:45.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:635: TEST_auto_repair_bluestore_failed_norecov: objectstore_tool td/osd-scrub-repair 0 obj2 remove 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 obj2 remove 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: 
_objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:51:45.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:51:45.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:51:45.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 obj2 remove 2026-03-08T22:51:45.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:51:45.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:51:45.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:51:45.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 
2026-03-08T22:51:45.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:51:45.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj2 remove 2026-03-08T22:51:46.314 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:104778fc:::obj2:head# 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:51:46.847 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:51:46.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 
2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:46.848 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:51:46.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:51:46.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:51:46.850 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:51:46.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:46.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:51:46.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:51:46.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:51:46.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:51:46.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:51:46.870 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:46.868+0000 7fe7cb9bb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:46.870 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:46.868+0000 7fe7cb9bb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:46.872 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:46.872+0000 7fe7cb9bb8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:47.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:47.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:48.076 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:48.076+0000 7fe7cb9bb8c0 -1 Falling back to public interface 2026-03-08T22:51:48.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:51:48.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:48.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:48.214 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:48.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:48.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:48.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:49.306 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:49.304+0000 7fe7cb9bb8c0 -1 osd.0 31 log_to_monitors true 2026-03-08T22:51:49.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:49.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:49.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:49.390 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:51:49.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:49.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:49.577 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:50.291 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:50.288+0000 7fe7c296b640 -1 osd.0 31 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:51:50.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:50.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:50.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:50.578 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:51:50.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:50.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:51:50.744 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 35 up_thru 27 down_at 32 last_clean_interval [25,31) [v2:127.0.0.1:6802/2697771231,v1:127.0.0.1:6803/2697771231] [v2:127.0.0.1:6804/2697771231,v1:127.0.0.1:6805/2697771231] exists,up 11d5b76a-49b0-44c5-bc6b-7ed443d1cea4 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:50.745 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:51:50.745 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:51:50.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:51:50.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:51:50.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:51:50.806 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:51:50.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:51:50.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:51:50.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:51:50.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:51:50.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:51:50.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:51:50.980 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:50.981 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:51:50.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:51:50.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:50.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:51:51.066 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855362 2026-03-08T22:51:51.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855362 2026-03-08T22:51:51.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855362' 2026-03-08T22:51:51.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:51.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:51:51.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018885 2026-03-08T22:51:51.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018885 2026-03-08T22:51:51.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855362 1-128849018885' 2026-03-08T22:51:51.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:51.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:51:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509450 2026-03-08T22:51:51.227 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509450 2026-03-08T22:51:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855362 1-128849018885 2-64424509450' 2026-03-08T22:51:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:51.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-150323855362 2026-03-08T22:51:51.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:51.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:51:51.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-150323855362 2026-03-08T22:51:51.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:51.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855362 2026-03-08T22:51:51.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 150323855362' 2026-03-08T22:51:51.230 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 150323855362 2026-03-08T22:51:51.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T22:51:51.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855362 -lt 150323855362 2026-03-08T22:51:51.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:51.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-128849018885 2026-03-08T22:51:51.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:51.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:51:51.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-128849018885 2026-03-08T22:51:51.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:51.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018885 2026-03-08T22:51:51.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 128849018885' 2026-03-08T22:51:51.399 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 128849018885 2026-03-08T22:51:51.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:51:51.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 128849018885 -lt 128849018885 2026-03-08T22:51:51.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:51.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509450 2026-03-08T22:51:51.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:51.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:51:51.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509450 2026-03-08T22:51:51.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:51.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509450 2026-03-08T22:51:51.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509450' 2026-03-08T22:51:51.570 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509450 2026-03-08T22:51:51.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:51:51.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509450 -lt 64424509450 2026-03-08T22:51:51.737 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:51:51.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:51.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:51.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:51:51.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:51:51.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:51:51.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:51:51.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:51:51.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:51:51.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:51:51.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:51:52.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:51:52.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:51:52.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:52.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:51:52.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:51:52.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:51:52.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:51:52.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:636: TEST_auto_repair_bluestore_failed_norecov: get_primary testpool SOMETHING 2026-03-08T22:51:52.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:51:52.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:51:52.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: 
ceph --format json osd map testpool SOMETHING 2026-03-08T22:51:52.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:636: TEST_auto_repair_bluestore_failed_norecov: objectstore_tool td/osd-scrub-repair 1 obj2 rm-attr _ 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 obj2 rm-attr _ 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:51:52.467 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:51:52.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:51:52.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:51:52.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:51:52.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:51:52.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:51:52.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:51:52.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 obj2 rm-attr _ 2026-03-08T22:51:52.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:51:52.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:51:52.573 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:51:52.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:51:52.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:51:52.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj2 rm-attr _ 2026-03-08T22:51:53.236 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:ff7b1f36:::obj1:head#, (61) No data available 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: 
activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: 
activate_osd: get_asok_path 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:51:53.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:51:53.772 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:51:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:51:53.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:51:53.773 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T22:51:53.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:51:53.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:51:53.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:51:53.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:51:53.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:51:53.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:51:53.792 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:53.792+0000 7f11264128c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:53.793 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:53.792+0000 7f11264128c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:51:53.795 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:53.792+0000 7f11264128c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:51:53.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:51:53.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:51:53.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:51:53.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:51:53.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:51:53.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:53.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:51:53.970 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:51:53.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:53.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:54.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:54.752 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:54.752+0000 7f11264128c0 -1 Falling back to public interface 2026-03-08T22:51:55.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:51:55.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:55.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:51:55.156 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:55.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:55.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:55.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:55.746 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:51:55.744+0000 7f11264128c0 -1 osd.1 36 log_to_monitors true 2026-03-08T22:51:56.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:56.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:56.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:51:56.338 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:51:56.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:56.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:56.512 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:51:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:51:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:51:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:51:57.514 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:51:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:51:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:51:57.694 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 40 up_thru 40 down_at 37 last_clean_interval [30,36) [v2:127.0.0.1:6810/863692182,v1:127.0.0.1:6811/863692182] [v2:127.0.0.1:6812/863692182,v1:127.0.0.1:6813/863692182] exists,up bb3bfe22-f830-4dbb-9f8e-43e0d214a8d1 2026-03-08T22:51:57.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:51:57.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:51:57.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:51:57.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:51:57.694 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:51:57.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:51:57.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:51:57.694 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:51:57.695 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:51:57.695 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:51:57.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:51:57.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:51:57.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:51:57.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:51:57.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:51:57.767 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:51:57.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:51:57.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:51:57.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:51:57.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:51:57.939 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:51:57.939 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:51:57.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:51:57.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:57.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:51:58.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855364 2026-03-08T22:51:58.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855364 2026-03-08T22:51:58.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855364' 
2026-03-08T22:51:58.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:58.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:51:58.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691842 2026-03-08T22:51:58.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691842 2026-03-08T22:51:58.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855364 1-171798691842' 2026-03-08T22:51:58.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:51:58.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:51:58.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509452 2026-03-08T22:51:58.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509452 2026-03-08T22:51:58.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855364 1-171798691842 2-64424509452' 2026-03-08T22:51:58.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:58.202 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-150323855364 2026-03-08T22:51:58.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:58.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:51:58.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-150323855364 2026-03-08T22:51:58.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:58.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855364 2026-03-08T22:51:58.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 150323855364' 2026-03-08T22:51:58.205 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 150323855364 2026-03-08T22:51:58.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:58.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855363 -lt 150323855364 2026-03-08T22:51:58.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:51:59.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:51:59.390 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:51:59.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855364 -lt 150323855364 2026-03-08T22:51:59.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:59.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-171798691842 2026-03-08T22:51:59.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:59.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:51:59.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-171798691842 2026-03-08T22:51:59.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:59.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691842 2026-03-08T22:51:59.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 171798691842' 2026-03-08T22:51:59.587 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 171798691842 2026-03-08T22:51:59.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T22:51:59.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691842 -lt 171798691842 2026-03-08T22:51:59.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:51:59.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509452 2026-03-08T22:51:59.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:51:59.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:51:59.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509452 2026-03-08T22:51:59.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:51:59.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509452 2026-03-08T22:51:59.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509452' 2026-03-08T22:51:59.758 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509452 2026-03-08T22:51:59.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:51:59.941 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509452 -lt 64424509452 2026-03-08T22:51:59.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:51:59.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:51:59.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:00.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:52:00.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:00.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:00.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:00.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:00.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:00.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
2026-03-08T22:52:00.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:00.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:52:00.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:00.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:00.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:00.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:52:00.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:52:00.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:52:00.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:637: TEST_auto_repair_bluestore_failed_norecov: ceph tell 'osd.*' config set osd_scrub_auto_repair true 2026-03-08T22:52:00.610 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T22:52:00.610 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' 
osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_auto_repair = '' (not observed, change may require restart) osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T22:52:00.610 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:52:00.617 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T22:52:00.617 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_auto_repair = '' (not observed, change may require restart) osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T22:52:00.617 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:52:00.624 INFO:tasks.workunit.client.0.vm03.stdout:osd.2: { 2026-03-08T22:52:00.624 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = 
''
osd_recovery_sleep_degraded_hybrid = ''
osd_recovery_sleep_degraded_ssd = ''
osd_recovery_sleep_hdd = ''
osd_recovery_sleep_hybrid = ''
osd_recovery_sleep_ssd = ''
osd_scrub_auto_repair = '' (not observed, change may require restart)
osd_scrub_sleep = ''
osd_snap_trim_sleep = ''
osd_snap_trim_sleep_hdd = ''
osd_snap_trim_sleep_hybrid = ''
osd_snap_trim_sleep_ssd = ''
"
2026-03-08T22:52:00.624 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T22:52:00.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:639: TEST_auto_repair_bluestore_failed_norecov: get_pg testpool obj1
2026-03-08T22:52:00.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=testpool
2026-03-08T22:52:00.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=obj1
2026-03-08T22:52:00.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map testpool obj1
2026-03-08T22:52:00.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T22:52:00.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:639: TEST_auto_repair_bluestore_failed_norecov: local pgid=1.0
2026-03-08T22:52:00.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:640: TEST_auto_repair_bluestore_failed_norecov: get_primary testpool obj1
2026-03-08T22:52:00.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T22:52:00.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1
2026-03-08T22:52:00.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool obj1
2026-03-08T22:52:00.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:52:00.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:640: TEST_auto_repair_bluestore_failed_norecov: local primary=1
2026-03-08T22:52:00.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:641: TEST_auto_repair_bluestore_failed_norecov: get_last_scrub_stamp 1.0
2026-03-08T22:52:00.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:52:00.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:52:00.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:52:00.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:52:01.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:641: TEST_auto_repair_bluestore_failed_norecov: local last_scrub_stamp=2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:01.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:642: TEST_auto_repair_bluestore_failed_norecov: ceph tell 1.0 schedule-deep-scrub
2026-03-08T22:52:01.216 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T22:52:01.216 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true,
2026-03-08T22:52:01.216 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T22:52:01.216 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-22T22:50:21.217811+0000"
2026-03-08T22:52:01.216 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:645: TEST_auto_repair_bluestore_failed_norecov: wait_for_scrub 1.0 2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:52:01.227
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:52:01.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:52:01.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:51:25.070829+0000 '>' 2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:01.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:52:02.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:52:02.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:52:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:52:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:52:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:52:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:52:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:52:02.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:51:25.070829+0000 '>' 2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:02.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:52:03.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:52:03.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:52:03.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:52:03.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:52:03.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:52:03.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:52:03.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:52:03.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:51:25.070829+0000 '>' 2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:03.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:52:04.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:52:04.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:52:04.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:52:04.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:52:04.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:52:04.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:52:04.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:52:04.899
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:51:25.070829+0000 '>' 2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:04.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:52:05.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:52:05.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:52:05.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:52:05.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:52:05.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:52:05.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:52:05.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:52:06.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:51:25.070829+0000 '>' 2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:06.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:52:07.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:52:07.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:52:07.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:52:07.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:52:07.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:52:07.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:52:07.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:52:07.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:51:25.070829+0000 '>' 2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:07.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:52:08.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:52:08.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:52:08.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T22:52:08.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T22:52:08.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:52:08.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:52:08.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T22:52:08.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:52:01.681066+0000 '>' 2026-03-08T22:51:25.070829+0000
2026-03-08T22:52:08.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T22:52:08.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:646: TEST_auto_repair_bluestore_failed_norecov: wait_for_clean
2026-03-08T22:52:08.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:52:08.412
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:52:08.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:52:08.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:52:08.412 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:52:08.412 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:52:08.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:52:08.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:52:08.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:52:08.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:52:08.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:52:08.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:52:08.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:52:08.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:52:08.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:52:08.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:52:08.638 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:52:08.638 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:52:08.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:52:08.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:52:08.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:52:08.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855367
2026-03-08T22:52:08.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855367
2026-03-08T22:52:08.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855367'
2026-03-08T22:52:08.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:52:08.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:52:08.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691845
2026-03-08T22:52:08.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691845
2026-03-08T22:52:08.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855367 1-171798691845'
2026-03-08T22:52:08.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:52:08.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:52:08.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509455
2026-03-08T22:52:08.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509455
2026-03-08T22:52:08.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855367 1-171798691845 2-64424509455'
2026-03-08T22:52:08.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:52:08.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-150323855367
2026-03-08T22:52:08.884
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:52:08.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:52:08.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-150323855367
2026-03-08T22:52:08.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:52:08.887 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 150323855367
2026-03-08T22:52:08.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855367
2026-03-08T22:52:08.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 150323855367'
2026-03-08T22:52:08.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:52:09.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855366 -lt 150323855367
2026-03-08T22:52:09.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:52:10.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:52:10.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:52:10.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855367 -lt 150323855367
2026-03-08T22:52:10.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:52:10.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-171798691845
2026-03-08T22:52:10.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:52:10.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:52:10.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-171798691845
2026-03-08T22:52:10.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:52:10.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691845
2026-03-08T22:52:10.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 171798691845'
2026-03-08T22:52:10.230 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 171798691845
2026-03-08T22:52:10.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:52:10.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691845 -lt 171798691845
2026-03-08T22:52:10.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:52:10.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509455
2026-03-08T22:52:10.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:52:10.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:52:10.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509455
2026-03-08T22:52:10.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:52:10.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509455
2026-03-08T22:52:10.399 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509455
2026-03-08T22:52:10.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509455'
2026-03-08T22:52:10.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:52:10.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509455 -lt 64424509455
2026-03-08T22:52:10.561
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:52:10.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:52:10.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:52:10.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T22:52:10.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:52:10.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:52:10.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:52:10.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:52:10.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:52:10.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:52:10.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:52:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T22:52:10.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:52:10.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:52:10.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:52:11.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T22:52:11.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:52:11.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:52:11.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:647: TEST_auto_repair_bluestore_failed_norecov: flush_pg_stats
2026-03-08T22:52:11.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:52:11.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:52:11.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:52:11.306 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:52:11.306 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:52:11.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:52:11.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:52:11.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:52:11.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855369
2026-03-08T22:52:11.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855369
2026-03-08T22:52:11.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855369'
2026-03-08T22:52:11.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:52:11.388 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:52:11.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691847
2026-03-08T22:52:11.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691847
2026-03-08T22:52:11.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270:
flush_pg_stats: seqs=' 0-150323855369 1-171798691847'
2026-03-08T22:52:11.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:52:11.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:52:11.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509457
2026-03-08T22:52:11.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509457
2026-03-08T22:52:11.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855369 1-171798691847 2-64424509457'
2026-03-08T22:52:11.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:52:11.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-150323855369
2026-03-08T22:52:11.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:52:11.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:52:11.558 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-150323855369
2026-03-08T22:52:11.558 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:52:11.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855369
2026-03-08T22:52:11.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 150323855369'
2026-03-08T22:52:11.559 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 150323855369
2026-03-08T22:52:11.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:52:11.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855368 -lt 150323855369
2026-03-08T22:52:11.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:52:12.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:52:12.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:52:12.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855368 -lt 150323855369
2026-03-08T22:52:12.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:52:13.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T22:52:13.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:52:14.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855369 -lt 150323855369
2026-03-08T22:52:14.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:52:14.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-171798691847
2026-03-08T22:52:14.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:52:14.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:52:14.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-171798691847
2026-03-08T22:52:14.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:52:14.066 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 171798691847
2026-03-08T22:52:14.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691847
2026-03-08T22:52:14.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 171798691847'
2026-03-08T22:52:14.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq
1 2026-03-08T22:52:14.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691847 -lt 171798691847 2026-03-08T22:52:14.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:14.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509457 2026-03-08T22:52:14.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:14.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:52:14.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509457 2026-03-08T22:52:14.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:14.254 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509457 2026-03-08T22:52:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509457 2026-03-08T22:52:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509457' 2026-03-08T22:52:14.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:52:14.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
test 64424509457 -lt 64424509457 2026-03-08T22:52:14.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:648: TEST_auto_repair_bluestore_failed_norecov: grep -q 'scrub_finish.*present with no repair possible' td/osd-scrub-repair/osd.1.log 2026-03-08T22:52:14.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:649: TEST_auto_repair_bluestore_failed_norecov: ceph pg dump pgs 2026-03-08T22:52:14.585 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:52:14.585 INFO:tasks.workunit.client.0.vm03.stdout:1.0 10 0 0 0 0 56 0 0 10 0 10 active+clean+inconsistent+failed_repair 2026-03-08T22:52:01.681115+0000 21'10 41:99 [1,0] 1 [1,0] 1 21'10 2026-03-08T22:52:01.681066+0000 21'10 2026-03-08T22:52:01.681066+0000 0 1 periodic scrub scheduled @ 2026-03-09T22:52:01.681066+0000 10 0 2026-03-08T22:52:14.585 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:52:14.586 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
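The `flush_pg_stats` trace above (ceph-helpers.sh:2263-2279 in this run) first collects an `osd-seq` pair per OSD, then loops over the pairs splitting them with `cut`. A minimal sketch of that parsing loop, reconstructed from the xtrace — the `ceph tell`/`ceph osd last-stat-seq` calls are replaced by the literal values captured in this log, since no cluster is available outside the test run:

```shell
# Seq list exactly as the trace built it: "<osd>-<seq>" per entry.
seqs=' 0-150323855369 1-171798691847 2-64424509457'
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # OSD id, e.g. 0 (helper line 2274)
    seq=$(echo "$s" | cut -d - -f 2)   # flush seq to wait for (line 2275)
    echo "waiting osd.$osd seq $seq"   # matches the stdout lines in the log
    # The real helper then polls `ceph osd last-stat-seq $osd` until it
    # reaches $seq, sleeping 1s per retry with a 300-iteration limit.
done
```

In the log this is why each `waiting osd.N seq ...` stdout line is followed by repeated `last-stat-seq` probes until the reported value stops being less than the target.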
2026-03-08T22:52:14.586 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:52:14.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:650: TEST_auto_repair_bluestore_failed_norecov: ceph pg dump pgs 2026-03-08T22:52:14.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:650: TEST_auto_repair_bluestore_failed_norecov: grep -q '^1.0.*+failed_repair' 2026-03-08T22:52:14.754 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:52:14.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T22:52:14.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:52:14.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:52:14.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:52:14.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:52:14.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:52:14.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:52:14.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:52:14.768 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:52:14.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:52:14.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:52:14.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:52:14.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:52:14.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:52:14.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:52:14.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:52:14.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:52:14.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:52:14.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:52:14.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: 
teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:52:14.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:52:14.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:52:14.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:52:14.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:52:14.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:14.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:14.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T22:52:14.908 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:52:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:52:14.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:52:14.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T22:52:14.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:52:14.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:52:14.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:52:14.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:52:14.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:52:14.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:52:14.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:52:14.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:52:14.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:52:14.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:52:14.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:52:14.915 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:52:14.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:52:14.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:14.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:14.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:52:14.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:52:14.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:52:14.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T22:52:14.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:52:14.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:14.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:14.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 
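The repeated `teardown` traces above (ceph-helpers.sh:164-207 in this run) boil down to killing daemons, skipping the FreeBSD/btrfs and coredump branches, and removing the test dir plus the per-run admin-socket dir. A sketch of that path, with `kill_daemons` stubbed out since no daemons exist here and the paths redirected into a temp dir to stay self-contained:

```shell
# Stand-ins for td/osd-scrub-repair and /tmp/ceph-asok.43024 from the trace.
base=$(mktemp -d)
dir=$base/osd-scrub-repair
asok_dir=$base/ceph-asok.43024
mkdir -p "$dir" "$asok_dir"

kill_daemons() { return 0; }   # stub; the real helper signals each pid file
kill_daemons "$dir" KILL       # ceph-helpers.sh:166

rm -fr "$dir"                  # ceph-helpers.sh:198
rm -rf "$asok_dir"             # ceph-helpers.sh:199
```

Note that `setup` (helper line 132) calls `teardown` first, which is why the log shows the whole teardown sequence again immediately before each `mkdir -p`.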
2026-03-08T22:52:14.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:52:14.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:52:14.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:52:14.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T22:52:14.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_auto_repair_bluestore_scrub td/osd-scrub-repair 2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:483: TEST_auto_repair_bluestore_scrub: local dir=td/osd-scrub-repair 2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:484: TEST_auto_repair_bluestore_scrub: local poolname=testpool 2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:487: TEST_auto_repair_bluestore_scrub: run_mon td/osd-scrub-repair a 2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 
2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T22:52:14.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T22:52:14.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:52:14.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:14.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:14.949 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:14.949 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:14.949 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:14.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:14.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 
--mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:52:14.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:52:14.978 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:52:14.979 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:52:14.979 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:52:14.979 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:52:14.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:52:14.981 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:52:14.981 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:52:14.981 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:52:14.992 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:14.992 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:14.992 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:14.992 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:52:14.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:52:14.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T22:52:15.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:52:15.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:52:15.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:52:15.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:52:15.061 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:52:15.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T22:52:15.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:488: TEST_auto_repair_bluestore_scrub: run_mgr td/osd-scrub-repair x 2026-03-08T22:52:15.126 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T22:52:15.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:52:15.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:52:15.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:52:15.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T22:52:15.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:52:15.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:52:15.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:15.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:15.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:15.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:15.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 
2026-03-08T22:52:15.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:15.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:52:15.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:52:15.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:492: TEST_auto_repair_bluestore_scrub: local 'ceph_osd_args=--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0' 2026-03-08T22:52:15.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:493: TEST_auto_repair_bluestore_scrub: seq 0 2 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:493: TEST_auto_repair_bluestore_scrub: for id in $(seq 0 2) 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:494: TEST_auto_repair_bluestore_scrub: run_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 
--osd-scrub-backoff-ratio=0 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:52:15.260 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:15.260 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:52:15.260 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0' 2026-03-08T22:52:15.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:52:15.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:52:15.262 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 743e61ab-830a-48df-acf9-b12bf01a5423 2026-03-08T22:52:15.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=743e61ab-830a-48df-acf9-b12bf01a5423 2026-03-08T22:52:15.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 743e61ab-830a-48df-acf9-b12bf01a5423' 2026-03-08T22:52:15.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:52:15.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCf/a1pWn1TEBAAMSjtVIqTmeTDv8QqzTYcyg== 2026-03-08T22:52:15.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCf/a1pWn1TEBAAMSjtVIqTmeTDv8QqzTYcyg=="}' 2026-03-08T22:52:15.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: 
ceph osd new 743e61ab-830a-48df-acf9-b12bf01a5423 -i td/osd-scrub-repair/0/new.json 2026-03-08T22:52:15.368 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:52:15.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T22:52:15.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 --mkfs --key AQCf/a1pWn1TEBAAMSjtVIqTmeTDv8QqzTYcyg== --osd-uuid 743e61ab-830a-48df-acf9-b12bf01a5423 2026-03-08T22:52:15.399 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:15.396+0000 7f39d2ca18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:15.408 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:15.408+0000 7f39d2ca18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:15.409 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:15.408+0000 7f39d2ca18c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:15.409 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:15.408+0000 7f39d2ca18c0 -1 bdev(0x5647e80adc00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:52:15.409 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:15.408+0000 7f39d2ca18c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T22:52:17.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T22:52:17.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:52:17.741 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T22:52:17.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:52:17.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:52:17.879 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:52:17.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:52:17.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 2026-03-08T22:52:17.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:52:17.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:52:17.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:52:17.898 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:17.896+0000 7f193a61d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:17.904 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:17.904+0000 7f193a61d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:17.907 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:17.904+0000 7f193a61d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:18.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:19.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:19.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T22:52:19.235 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:52:19.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:52:19.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:19.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:19.340 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:19.340+0000 7f193a61d8c0 -1 Falling back to public interface 2026-03-08T22:52:19.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:20.309 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:20.308+0000 7f193a61d8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:52:20.422 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:52:20.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:20.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:20.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:52:20.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:20.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:20.615 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:21.617 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:52:21.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:21.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:21.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:52:21.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:21.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:21.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:21.815 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:21.816+0000 7f1935dd6640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:52:22.812 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T22:52:22.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:22.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:22.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:52:22.813 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:22.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2122259173,v1:127.0.0.1:6803/2122259173] [v2:127.0.0.1:6804/2122259173,v1:127.0.0.1:6805/2122259173] exists,up 743e61ab-830a-48df-acf9-b12bf01a5423 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:493: TEST_auto_repair_bluestore_scrub: for id in $(seq 0 2) 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:494: TEST_auto_repair_bluestore_scrub: run_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:52:22.979 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:52:22.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:52:22.980 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:52:22.980 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:52:22.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0' 
2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:52:22.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:52:22.982 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 ed3098c5-62c2-4222-8efc-ed3f305faaea 2026-03-08T22:52:22.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ed3098c5-62c2-4222-8efc-ed3f305faaea 2026-03-08T22:52:22.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 ed3098c5-62c2-4222-8efc-ed3f305faaea' 2026-03-08T22:52:22.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:52:22.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCm/a1pxVpFOxAAQLpZU/VmNcBN2hhxMfGMdg== 2026-03-08T22:52:22.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCm/a1pxVpFOxAAQLpZU/VmNcBN2hhxMfGMdg=="}' 2026-03-08T22:52:22.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ed3098c5-62c2-4222-8efc-ed3f305faaea -i td/osd-scrub-repair/1/new.json 2026-03-08T22:52:23.160 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:52:23.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T22:52:23.175 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 --mkfs --key AQCm/a1pxVpFOxAAQLpZU/VmNcBN2hhxMfGMdg== --osd-uuid ed3098c5-62c2-4222-8efc-ed3f305faaea 2026-03-08T22:52:23.192 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:23.192+0000 7f37d03ec8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:23.194 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:23.192+0000 7f37d03ec8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:23.195 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:23.196+0000 7f37d03ec8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:23.196 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:23.196+0000 7f37d03ec8c0 -1 bdev(0x55aeb1403c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:52:23.196 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:23.196+0000 7f37d03ec8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T22:52:25.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T22:52:25.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:52:25.453 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T22:52:25.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:52:25.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:52:25.656 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T22:52:25.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:52:25.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 2026-03-08T22:52:25.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:52:25.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:52:25.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:52:25.674 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:25.672+0000 7fbc800a18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:25.674 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:25.672+0000 7fbc800a18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:25.676 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:25.676+0000 7fbc800a18c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:25.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:52:26.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:26.628 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:26.628+0000 7fbc800a18c0 -1 Falling back to public interface 2026-03-08T22:52:27.033 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:52:27.033 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:27.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:27.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:52:27.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:27.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:52:27.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:27.596 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:27.596+0000 7fbc800a18c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:52:28.203 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:52:28.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:28.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:28.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:52:28.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:28.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:52:28.390 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:28.669 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:28.668+0000 7fbc7b85a640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T22:52:29.391 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:52:29.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:29.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:29.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:52:29.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:29.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:52:29.566 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1425404427,v1:127.0.0.1:6811/1425404427] [v2:127.0.0.1:6812/1425404427,v1:127.0.0.1:6813/1425404427] exists,up ed3098c5-62c2-4222-8efc-ed3f305faaea 2026-03-08T22:52:29.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:52:29.567 
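The osd.1 bring-up above ends with `wait_for_osd up 1` polling `ceph osd dump` once per second until the OSD reports up. A minimal sketch of that loop, reconstructed from the xtrace lines (ceph-helpers.sh:977–991) — not the verbatim helper; `OSD_DUMP_CMD` is a hypothetical stand-in for `ceph osd dump` so the sketch runs without a live cluster:

```shell
# Sketch of wait_for_osd as traced above (reconstruction, not verbatim).
# OSD_DUMP_CMD is an assumption standing in for "ceph osd dump".
OSD_DUMP_CMD=${OSD_DUMP_CMD:-"ceph osd dump"}

wait_for_osd() {
    local state=$1      # e.g. "up"
    local id=$2         # OSD id to wait for
    local status=1
    for ((i = 0; i < 300; i++)); do          # ~300 one-second attempts
        echo $i
        if $OSD_DUMP_CMD | grep "osd.$id $state"; then
            status=0                         # found "osd.N up" in the dump
            break
        fi
        sleep 1
    done
    return $status
}
```

In the trace, the loop took four iterations (echo 0..3) before `grep 'osd.1 up'` matched the osd dump line and the helper returned 0.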
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:493: TEST_auto_repair_bluestore_scrub: for id in $(seq 0 2) 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:494: TEST_auto_repair_bluestore_scrub: run_osd td/osd-scrub-repair 2 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 
2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:29.567 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:29.567 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:52:29.568 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0' 2026-03-08T22:52:29.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:52:29.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:52:29.570 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 3eb3fdd5-6cb9-44cc-b5d1-87efee29d51e 2026-03-08T22:52:29.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3eb3fdd5-6cb9-44cc-b5d1-87efee29d51e 2026-03-08T22:52:29.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 3eb3fdd5-6cb9-44cc-b5d1-87efee29d51e' 2026-03-08T22:52:29.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:52:29.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: 
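The `ceph_args+=` sequence traced above (ceph-helpers.sh:639–659) assembles the daemon command line one flag at a time: cluster-wide settings first, then per-OSD paths derived from `$dir/$id`, then the caller's extras (here the scrub knobs). A condensed, hypothetical re-creation of that assembly — `build_osd_args` is not a real helper, and only a subset of the real flags is reproduced:

```shell
# Hypothetical condensation of the ceph_args+= assembly traced above.
build_osd_args() {
    local dir=$1 id=$2
    shift 2
    local args="--osd-data=$dir/$id"
    args+=" --osd-journal=$dir/$id/journal"
    args+=" --run-dir=$dir"
    args+=" --debug-osd=20 --debug-ms=1 --debug-monc=20"
    args+=" --log-file=$dir/\$name.log"   # literal $name, expanded by ceph
    args+=" $*"                           # caller extras, e.g. scrub knobs
    echo "$args"
}

build_osd_args td/osd-scrub-repair 2 --osd-scrub-auto-repair=true
```

Note the quoting in the real trace: `$name` in `--log-file` and `--admin-socket` is deliberately left unexpanded by the shell so each daemon substitutes its own name.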
run_osd: OSD_SECRET=AQCt/a1pq0jKIhAAFboZkGrl3UwbieS43PTQXA== 2026-03-08T22:52:29.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCt/a1pq0jKIhAAFboZkGrl3UwbieS43PTQXA=="}' 2026-03-08T22:52:29.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3eb3fdd5-6cb9-44cc-b5d1-87efee29d51e -i td/osd-scrub-repair/2/new.json 2026-03-08T22:52:30.162 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:52:30.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T22:52:30.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 --mkfs --key AQCt/a1pq0jKIhAAFboZkGrl3UwbieS43PTQXA== --osd-uuid 3eb3fdd5-6cb9-44cc-b5d1-87efee29d51e 2026-03-08T22:52:30.197 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:30.196+0000 7fc8fbb398c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:30.199 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:30.200+0000 7fc8fbb398c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:30.200 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:30.200+0000 7fc8fbb398c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:30.200 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:30.200+0000 7fc8fbb398c0 -1 bdev(0x56338bf0bc00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:52:30.200 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:30.200+0000 7fc8fbb398c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T22:52:32.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T22:52:32.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:52:32.714 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T22:52:32.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:52:32.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:52:32.938 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T22:52:32.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:52:32.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 2026-03-08T22:52:32.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:52:32.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:52:32.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:52:32.955 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:32.952+0000 7fc15f5d28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:32.955 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:32.956+0000 7fc15f5d28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:32.957 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:32.956+0000 7fc15f5d28c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:33.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:52:33.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:34.164 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:34.164+0000 7fc15f5d28c0 -1 Falling back to public interface 2026-03-08T22:52:34.366 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:52:34.366 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:34.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:34.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:52:34.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:34.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:52:34.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:35.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:35.148+0000 7fc15f5d28c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:52:35.544 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:52:35.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:35.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:35.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:52:35.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:52:35.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:35.741 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:36.742 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:52:36.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:36.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:36.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:52:36.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:36.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:52:36.909 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2430628352,v1:127.0.0.1:6819/2430628352] [v2:127.0.0.1:6820/2430628352,v1:127.0.0.1:6821/2430628352] exists,up 3eb3fdd5-6cb9-44cc-b5d1-87efee29d51e 2026-03-08T22:52:36.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:52:36.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:52:36.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:52:36.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:497: TEST_auto_repair_bluestore_scrub: create_pool 
testpool 1 1 2026-03-08T22:52:36.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create testpool 1 1 2026-03-08T22:52:37.133 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool' created 2026-03-08T22:52:37.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:52:38.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:498: TEST_auto_repair_bluestore_scrub: ceph osd pool set testpool size 2 2026-03-08T22:52:38.407 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 size to 2 2026-03-08T22:52:38.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:499: TEST_auto_repair_bluestore_scrub: wait_for_clean 2026-03-08T22:52:38.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:52:38.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:52:38.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:52:38.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:52:38.423 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:52:38.423 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 
2026-03-08T22:52:38.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:52:38.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:52:38.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:52:38.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:52:38.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:52:38.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:52:38.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:52:38.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:52:38.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:52:38.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:52:38.676 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:52:38.676 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:52:38.676 
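The retry schedule for `wait_for_clean` comes from `get_timeout_delays 90 .1`, whose result appears in the trace: steps double from 0.1s, are capped at 15s, and the final step is trimmed so the delays sum exactly to the 90s timeout. A hedged awk re-implementation of that behavior (the real helper is pure bash; this is a sketch of the same schedule, not the verbatim code):

```shell
# Sketch of get_timeout_delays: exponential backoff, capped per step,
# with the last step trimmed so the total equals the requested timeout.
get_timeout_delays() {
    local timeout=$1 first=${2:-1} cap=${3:-15}
    awk -v t="$timeout" -v s="$first" -v c="$cap" 'BEGIN {
        sep = ""
        total = 0
        while (total < t) {
            if (s > c) s = c              # cap each step
            if (total + s >= t) {         # trim final step to hit the total
                printf "%s%g", sep, t - total
                break
            }
            printf "%s%g", sep, s
            total += s
            s *= 2                        # exponential backoff
            sep = " "
        }
        print ""
    }'
}
```

For `get_timeout_delays 90 .1` this reproduces the array from the trace: `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`.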
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:52:38.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:38.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:52:38.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T22:52:38.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T22:52:38.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T22:52:38.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:38.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:52:38.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672964 2026-03-08T22:52:38.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672964 2026-03-08T22:52:38.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964' 2026-03-08T22:52:38.847 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:38.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:52:38.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509442 2026-03-08T22:52:38.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509442 2026-03-08T22:52:38.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964 2-64424509442' 2026-03-08T22:52:38.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:38.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T22:52:38.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:38.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:52:38.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T22:52:38.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:38.937 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 
2026-03-08T22:52:38.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T22:52:38.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T22:52:38.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:39.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T22:52:39.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:52:40.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:52:40.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:40.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485 2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672964 2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:40.290 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672964 2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:40.290 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672964 2026-03-08T22:52:40.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672964 2026-03-08T22:52:40.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672964' 2026-03-08T22:52:40.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:52:40.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672964 -lt 42949672964 2026-03-08T22:52:40.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:40.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509442 2026-03-08T22:52:40.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:40.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:52:40.480 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509442 2026-03-08T22:52:40.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:40.481 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442 2026-03-08T22:52:40.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509442 2026-03-08T22:52:40.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509442' 2026-03-08T22:52:40.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:52:40.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509442 2026-03-08T22:52:40.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:52:40.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:40.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:40.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:52:40.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 
2026-03-08T22:52:40.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:40.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:40.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:40.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:40.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:52:40.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:41.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:52:41.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:41.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:41.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:41.255 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1 2026-03-08T22:52:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' -1 2026-03-08T22:52:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1674: wait_for_clean: loop=0 2026-03-08T22:52:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1675: wait_for_clean: num_active_clean=0 2026-03-08T22:52:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:52:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.1 2026-03-08T22:52:41.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:52:41.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:41.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:41.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:41.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:41.356 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:41.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:52:41.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:41.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:52:41.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:41.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:41.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:41.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1 2026-03-08T22:52:41.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:52:41.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:52:41.740 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:52:41.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:52:41.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:52:41.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:52:41.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:52:41.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:52:41.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:52:41.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:52:41.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 1 >= 13 )) 2026-03-08T22:52:41.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: 
wait_for_clean: eval 2026-03-08T22:52:41.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.2 2026-03-08T22:52:42.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:52:42.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:42.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:42.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:42.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:42.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:42.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:52:42.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:42.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:52:42.323 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:42.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:42.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1 2026-03-08T22:52:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:52:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:52:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:52:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:52:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:52:42.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:52:42.533 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:52:42.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:52:42.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:52:42.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:52:42.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 2 >= 13 )) 2026-03-08T22:52:42.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:52:42.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.4 2026-03-08T22:52:43.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:52:43.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:43.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:43.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 
2026-03-08T22:52:43.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:43.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:43.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:52:43.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:43.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=0 2026-03-08T22:52:43.327 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:43.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:43.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:43.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 0 = 1 2026-03-08T22:52:43.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1673: wait_for_clean: test 0 '!=' 0 2026-03-08T22:52:43.541 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1676: wait_for_clean: get_is_making_recovery_progress 2026-03-08T22:52:43.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1334: get_is_making_recovery_progress: local recovery_progress 2026-03-08T22:52:43.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1335: get_is_making_recovery_progress: recovery_progress+='.recovering_keys_per_sec + ' 2026-03-08T22:52:43.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1336: get_is_making_recovery_progress: recovery_progress+='.recovering_bytes_per_sec + ' 2026-03-08T22:52:43.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1337: get_is_making_recovery_progress: recovery_progress+=.recovering_objects_per_sec 2026-03-08T22:52:43.541 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: ceph --format json status 2026-03-08T22:52:43.541 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: jq -r '.pgmap | .recovering_keys_per_sec + .recovering_bytes_per_sec + .recovering_objects_per_sec' 2026-03-08T22:52:43.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1339: get_is_making_recovery_progress: local progress=null 2026-03-08T22:52:43.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1340: get_is_making_recovery_progress: test null '!=' null 2026-03-08T22:52:43.759 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1678: wait_for_clean: (( 3 >= 13 )) 2026-03-08T22:52:43.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1683: wait_for_clean: eval 2026-03-08T22:52:43.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1684: wait_for_clean: sleep 0.8 2026-03-08T22:52:44.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1685: wait_for_clean: loop+=1 2026-03-08T22:52:44.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:44.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:44.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:44.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:44.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:44.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:52:44.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and 
contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:44.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:52:44.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:44.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:44.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:44.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:52:44.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:52:44.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:52:44.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:502: TEST_auto_repair_bluestore_scrub: local payload=ABCDEF 2026-03-08T22:52:44.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:503: TEST_auto_repair_bluestore_scrub: echo ABCDEF 2026-03-08T22:52:44.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:504: TEST_auto_repair_bluestore_scrub: rados --pool testpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T22:52:44.974 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:508: TEST_auto_repair_bluestore_scrub: get_not_primary testpool SOMETHING 2026-03-08T22:52:44.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool 2026-03-08T22:52:44.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:52:44.974 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:52:44.974 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:52:44.974 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:52:44.974 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:52:44.974 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:52:45.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:52:45.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:52:45.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: 
get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T22:52:45.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:508: TEST_auto_repair_bluestore_scrub: objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:52:45.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:52:45.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:52:45.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:52:45.321 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:52:45.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:52:45.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:52:45.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:52:45.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:52:45.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:52:45.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:52:45.627 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:52:45.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:52:45.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING remove 2026-03-08T22:52:46.286 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:eb822e21:::SOMETHING:head# 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: 
activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:52:46.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local 
name= 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:52:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:52:46.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:52:46.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:52:46.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:52:46.821 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:52:46.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:52:46.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:52:46.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:52:46.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0' 2026-03-08T22:52:46.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:52:46.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:52:46.822 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:52:46.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0 2026-03-08T22:52:46.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:52:46.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:52:46.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:52:46.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:52:46.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:52:46.840 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:46.835+0000 7efd4a3678c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:46.844 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:46.843+0000 7efd4a3678c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:52:46.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:46.843+0000 7efd4a3678c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:47.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:47.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:48.041 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:48.039+0000 7efd4a3678c0 -1 Falling back to public interface 2026-03-08T22:52:48.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:52:48.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:48.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:52:48.174 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:52:48.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:48.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:48.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:49.142 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:52:49.139+0000 7efd4a3678c0 -1 osd.0 20 log_to_monitors true 2026-03-08T22:52:49.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:49.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:49.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:52:49.363 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:52:49.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:49.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:49.553 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:52:50.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:52:50.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:52:50.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:52:50.555 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:52:50.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:52:50.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:52:50.771 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 24 up_thru 0 down_at 21 last_clean_interval [5,20) [v2:127.0.0.1:6802/3845227665,v1:127.0.0.1:6803/3845227665] [v2:127.0.0.1:6804/3845227665,v1:127.0.0.1:6805/3845227665] exists,up 743e61ab-830a-48df-acf9-b12bf01a5423 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:52:50.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:52:50.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:52:50.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:52:50.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:52:50.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:52:50.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T22:52:50.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:52:50.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:52:50.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:52:50.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:52:51.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:52:51.035 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:52:51.036 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:52:51.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:52:51.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:51.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:52:51.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215106 2026-03-08T22:52:51.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215106 2026-03-08T22:52:51.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-103079215106' 2026-03-08T22:52:51.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:51.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:52:51.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672967 2026-03-08T22:52:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672967 2026-03-08T22:52:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672967' 2026-03-08T22:52:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:52:51.209 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:52:51.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509445 2026-03-08T22:52:51.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509445 2026-03-08T22:52:51.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672967 2-64424509445' 2026-03-08T22:52:51.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:52:51.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215106 2026-03-08T22:52:51.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:51.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:52:51.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215106 2026-03-08T22:52:51.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:51.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215106 2026-03-08T22:52:51.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215106' 2026-03-08T22:52:51.288 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 103079215106 2026-03-08T22:52:51.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:51.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 103079215106 2026-03-08T22:52:51.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:52:52.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:52:52.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:52:52.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215106 -lt 103079215106 2026-03-08T22:52:52.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:52.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672967 2026-03-08T22:52:52.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:52.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:52:52.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672967 2026-03-08T22:52:52.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:52.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672967 2026-03-08T22:52:52.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672967' 2026-03-08T22:52:52.658 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672967 2026-03-08T22:52:52.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T22:52:52.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672967 -lt 42949672967 2026-03-08T22:52:52.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:52:52.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509445 2026-03-08T22:52:52.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:52:52.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:52:52.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509445 2026-03-08T22:52:52.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:52:52.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509445 2026-03-08T22:52:52.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509445' 2026-03-08T22:52:52.841 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509445 2026-03-08T22:52:52.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:52:53.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 64424509445 -lt 64424509445 2026-03-08T22:52:53.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:52:53.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:53.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:53.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:52:53.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:52:53.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:52:53.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:52:53.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:52:53.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:52:53.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:52:53.238 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:52:53.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:52:53.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:52:53.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:52:53.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:52:53.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:52:53.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:52:53.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:52:53.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:510: TEST_auto_repair_bluestore_scrub: get_pg testpool SOMETHING 2026-03-08T22:52:53.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=testpool 2026-03-08T22:52:53.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 
2026-03-08T22:52:53.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map testpool SOMETHING 2026-03-08T22:52:53.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:52:53.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:510: TEST_auto_repair_bluestore_scrub: local pgid=1.0 2026-03-08T22:52:53.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:511: TEST_auto_repair_bluestore_scrub: get_primary testpool SOMETHING 2026-03-08T22:52:53.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:52:53.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:52:53.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:52:53.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:52:53.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:511: TEST_auto_repair_bluestore_scrub: local primary=1 2026-03-08T22:52:53.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:512: TEST_auto_repair_bluestore_scrub: get_last_scrub_stamp 1.0 2026-03-08T22:52:53.987 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:52:53.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:52:53.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:52:53.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:52:54.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:512: TEST_auto_repair_bluestore_scrub: local last_scrub_stamp=2026-03-08T22:52:37.132999+0000 2026-03-08T22:52:54.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:513: TEST_auto_repair_bluestore_scrub: ceph tell 1.0 schedule-scrub 2026-03-08T22:52:54.230 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T22:52:54.231 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T22:52:54.231 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T22:52:54.231 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T22:51:14.231514+0000" 2026-03-08T22:52:54.231 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:516: TEST_auto_repair_bluestore_scrub: wait_for_scrub 1.0 2026-03-08T22:52:37.132999+0000 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: 
local pgid=1.0 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:52:37.132999+0000 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:52:54.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:52:54.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:52:54.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:52:54.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:52:37.132999+0000 '>' 2026-03-08T22:52:37.132999+0000 2026-03-08T22:52:54.425 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:52:55.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:52:55.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:52:55.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:52:55.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:52:55.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:52:55.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:52:55.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:52:55.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:52:37.132999+0000 '>' 2026-03-08T22:52:37.132999+0000 2026-03-08T22:52:55.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:52:56.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T22:52:56.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:52:56.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:52:56.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:52:56.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:52:56.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:52:56.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:52:56.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:52:37.132999+0000 '>' 2026-03-08T22:52:37.132999+0000 2026-03-08T22:52:56.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:52:57.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:52:57.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:52:57.784 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:52:57.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:52:57.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:52:57.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:52:57.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:52:57.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:52:37.132999+0000 '>' 2026-03-08T22:52:37.132999+0000 2026-03-08T22:52:57.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:52:58.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:52:58.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:52:58.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:52:58.956 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:52:58.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:52:58.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:52:58.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:52:59.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:52:37.132999+0000 '>' 2026-03-08T22:52:37.132999+0000 2026-03-08T22:52:59.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:53:00.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:53:00.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:53:00.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T22:53:00.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:53:00.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:53:00.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:53:00.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:53:00.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:52:55.668878+0000 '>' 2026-03-08T22:52:37.132999+0000 2026-03-08T22:53:00.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:53:00.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:517: TEST_auto_repair_bluestore_scrub: ceph pg dump pgs 2026-03-08T22:53:00.482 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:53:00.482 INFO:tasks.workunit.client.0.vm03.stdout:1.0 1 0 0 0 0 7 0 0 1 0 1 active+clean 2026-03-08T22:52:55.671433+0000 20'1 25:66 [1,0] 1 [1,0] 1 20'1 2026-03-08T22:52:55.668878+0000 20'1 2026-03-08T22:52:55.668878+0000 0 1 periodic scrub scheduled @ 2026-03-09T22:52:55.668878+0000 1 0 2026-03-08T22:53:00.483 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:53:00.483 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on 
utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T22:53:00.483 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:53:00.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:519: TEST_auto_repair_bluestore_scrub: sleep 2 2026-03-08T22:53:02.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:520: TEST_auto_repair_bluestore_scrub: ceph pg dump pgs 2026-03-08T22:53:02.653 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:53:02.653 INFO:tasks.workunit.client.0.vm03.stdout:1.0 1 0 0 0 0 7 0 0 1 0 1 active+clean 2026-03-08T22:52:55.671433+0000 20'1 25:66 [1,0] 1 [1,0] 1 20'1 2026-03-08T22:52:55.668878+0000 20'1 2026-03-08T22:52:55.668878+0000 0 1 periodic scrub scheduled @ 2026-03-09T22:52:55.668878+0000 1 0 2026-03-08T22:53:02.653 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:53:02.653 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:53:02.653 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:53:02.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:521: TEST_auto_repair_bluestore_scrub: sleep 2 2026-03-08T22:53:04.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:522: TEST_auto_repair_bluestore_scrub: ceph pg dump pgs 2026-03-08T22:53:04.818 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T22:53:04.818 INFO:tasks.workunit.client.0.vm03.stdout:1.0 1 0 0 0 0 7 0 0 1 0 1 active+clean 2026-03-08T22:52:55.671433+0000 20'1 25:66 [1,0] 1 [1,0] 1 20'1 2026-03-08T22:52:55.668878+0000 20'1 2026-03-08T22:52:55.668878+0000 0 1 periodic scrub scheduled @ 2026-03-09T22:52:55.668878+0000 1 0 2026-03-08T22:53:04.818 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T22:53:04.818 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T22:53:04.818 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:53:04.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:523: TEST_auto_repair_bluestore_scrub: sleep 5 2026-03-08T22:53:09.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:524: TEST_auto_repair_bluestore_scrub: wait_for_clean 2026-03-08T22:53:09.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:53:09.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:53:09.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:53:09.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:53:09.831 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:53:09.831 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:53:09.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:53:09.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:53:09.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:53:09.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:53:09.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:53:09.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:53:09.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:53:09.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:53:09.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:53:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:53:10.062 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:53:10.062 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:53:10.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:53:10.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:10.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:53:10.138 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215111 2026-03-08T22:53:10.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215111 2026-03-08T22:53:10.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215111' 2026-03-08T22:53:10.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:10.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:53:10.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672972 2026-03-08T22:53:10.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672972 2026-03-08T22:53:10.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215111 1-42949672972' 2026-03-08T22:53:10.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:10.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:53:10.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509450 2026-03-08T22:53:10.294 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509450 2026-03-08T22:53:10.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215111 1-42949672972 2-64424509450' 2026-03-08T22:53:10.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:10.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215111 2026-03-08T22:53:10.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:53:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:53:10.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215111 2026-03-08T22:53:10.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:53:10.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215111 2026-03-08T22:53:10.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215111' 2026-03-08T22:53:10.297 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 103079215111 2026-03-08T22:53:10.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T22:53:10.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215110 -lt 103079215111 2026-03-08T22:53:10.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:53:11.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:53:11.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:53:11.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215110 -lt 103079215111 2026-03-08T22:53:11.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:53:12.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:53:12.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:53:12.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215111 -lt 103079215111 2026-03-08T22:53:12.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:12.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672972 2026-03-08T22:53:12.812 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:53:12.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:53:12.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672972 2026-03-08T22:53:12.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:53:12.815 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672972 2026-03-08T22:53:12.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672972 2026-03-08T22:53:12.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672972' 2026-03-08T22:53:12.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:53:12.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672972 -lt 42949672972 2026-03-08T22:53:12.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:12.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509450 2026-03-08T22:53:12.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T22:53:12.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:53:12.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:12.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509450
2026-03-08T22:53:12.991 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509450
2026-03-08T22:53:12.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509450
2026-03-08T22:53:12.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509450'
2026-03-08T22:53:12.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:53:13.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509450 -lt 64424509450
2026-03-08T22:53:13.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:53:13.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:53:13.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:53:13.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T22:53:13.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:53:13.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:53:13.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:53:13.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:53:13.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:53:13.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:53:13.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:53:13.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T22:53:13.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:53:13.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:53:13.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:53:13.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T22:53:13.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:53:13.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:53:13.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:525: TEST_auto_repair_bluestore_scrub: ceph pg dump pgs
2026-03-08T22:53:13.905 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-08T22:53:13.905 INFO:tasks.workunit.client.0.vm03.stdout:1.0 1 0 0 0 0 7 0 0 1 0 1 active+clean 2026-03-08T22:52:55.671433+0000 20'1 25:66 [1,0] 1 [1,0] 1 20'1 2026-03-08T22:52:55.668878+0000 20'1 2026-03-08T22:52:55.668878+0000 0 1 periodic scrub scheduled @ 2026-03-09T22:52:55.668878+0000 1 0
2026-03-08T22:53:13.905 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-08T22:53:13.905 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-08T22:53:13.905 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T22:53:13.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:528: TEST_auto_repair_bluestore_scrub: get_not_primary testpool SOMETHING
2026-03-08T22:53:13.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool
2026-03-08T22:53:13.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING
2026-03-08T22:53:13.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING
2026-03-08T22:53:13.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T22:53:13.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING
2026-03-08T22:53:13.919 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING
2026-03-08T22:53:13.919 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:53:14.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1
2026-03-08T22:53:14.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]'
2026-03-08T22:53:14.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:528: TEST_auto_repair_bluestore_scrub: objectstore_tool td/osd-scrub-repair 0 SOMETHING list-attrs
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING list-attrs
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:53:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:53:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:53:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING list-attrs
2026-03-08T22:53:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T22:53:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:53:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T22:53:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:53:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:53:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING list-attrs
2026-03-08T22:53:14.710 INFO:tasks.workunit.client.0.vm03.stdout:_
2026-03-08T22:53:14.710 INFO:tasks.workunit.client.0.vm03.stdout:snapset
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:53:14.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0'
2026-03-08T22:53:14.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T22:53:15.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T22:53:15.000 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T22:53:15.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-scrub-backoff-ratio=0
2026-03-08T22:53:15.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T22:53:15.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T22:53:15.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:53:15.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:53:15.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:53:15.021 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:15.015+0000 7fe20a8058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:53:15.021 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:15.019+0000 7fe20a8058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:53:15.023 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:15.019+0000 7fe20a8058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:53:15.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:53:15.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:53:15.968 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:15.967+0000 7fe20a8058c0 -1 Falling back to public interface
2026-03-08T22:53:16.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:53:16.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:53:16.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:53:16.393 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:53:16.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:53:16.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:53:16.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:53:17.111 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:17.107+0000 7fe20a8058c0 -1 osd.0 25 log_to_monitors true
2026-03-08T22:53:17.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:53:17.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:53:17.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:53:17.565 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:53:17.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:53:17.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:53:17.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:53:18.369 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:18.367+0000 7fe2017b5640 -1 osd.0 25 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T22:53:18.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:53:18.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:53:18.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:53:18.748 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T22:53:18.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:53:18.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:53:18.917 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 29 up_thru 0 down_at 26 last_clean_interval [24,25) [v2:127.0.0.1:6802/1809730852,v1:127.0.0.1:6803/1809730852] [v2:127.0.0.1:6804/1809730852,v1:127.0.0.1:6805/1809730852] exists,up 743e61ab-830a-48df-acf9-b12bf01a5423
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:53:18.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:53:18.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:53:18.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:53:18.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:53:18.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:53:18.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:53:18.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:53:18.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:53:18.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:53:18.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:53:19.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:53:19.156 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:53:19.156 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:53:19.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:53:19.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:19.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:53:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051586
2026-03-08T22:53:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051586
2026-03-08T22:53:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586'
2026-03-08T22:53:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:19.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:53:19.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672975
2026-03-08T22:53:19.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672975
2026-03-08T22:53:19.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586 1-42949672975'
2026-03-08T22:53:19.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:19.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:53:19.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509453
2026-03-08T22:53:19.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509453
2026-03-08T22:53:19.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586 1-42949672975 2-64424509453'
2026-03-08T22:53:19.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:19.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-124554051586
2026-03-08T22:53:19.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:19.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:53:19.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-124554051586
2026-03-08T22:53:19.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:19.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051586
2026-03-08T22:53:19.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 124554051586'
2026-03-08T22:53:19.414 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 124554051586
2026-03-08T22:53:19.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:53:19.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 124554051586
2026-03-08T22:53:19.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:53:20.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:53:20.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:53:20.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051586 -lt 124554051586
2026-03-08T22:53:20.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:20.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672975
2026-03-08T22:53:20.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:20.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:53:20.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672975
2026-03-08T22:53:20.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:20.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672975
2026-03-08T22:53:20.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672975'
2026-03-08T22:53:20.773 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672975
2026-03-08T22:53:20.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:53:20.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672975 -lt 42949672975
2026-03-08T22:53:20.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:20.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509453
2026-03-08T22:53:20.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:53:20.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:20.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509453
2026-03-08T22:53:20.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509453
2026-03-08T22:53:20.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509453' 2026-03-08T22:53:20.955 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509453 2026-03-08T22:53:20.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:53:21.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509453 -lt 64424509453 2026-03-08T22:53:21.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:53:21.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:53:21.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:53:21.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:53:21.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:53:21.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:53:21.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:53:21.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: 
get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:53:21.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:53:21.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:53:21.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:53:21.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:53:21.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:53:21.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:53:21.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:53:21.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:53:21.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:53:21.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:53:21.779 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:529: TEST_auto_repair_bluestore_scrub: rados --pool testpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T22:53:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:530: TEST_auto_repair_bluestore_scrub: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:53:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:531: TEST_auto_repair_bluestore_scrub: grep scrub_finish td/osd-scrub-repair/osd.1.log 2026-03-08T22:53:21.802 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:52:54.631+0000 7fbc6382a640 10 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing [ 1.0: DEEP_SCRUB_ON_ERROR ] planned DEEP_SCRUB_ON_ERROR] scrubber: scrub_finish before flags: DEEP_SCRUB_ON_ERROR. repair state: no-repair. 
deep_scrub_on_error: 1 2026-03-08T22:53:21.802 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:52:54.631+0000 7fbc6382a640 15 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing [ 1.0: DEEP_SCRUB_ON_ERROR ] ] scrubber: scrub_finish Try to auto repair after scrub errors 2026-03-08T22:53:21.802 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:52:54.631+0000 7fbc6382a640 10 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing [ 1.0: ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 0 2026-03-08T22:53:21.802 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:52:55.667+0000 7fbc6382a640 10 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: AUTO_REPAIR ] planned AUTO_REPAIR TIME_FOR_DEEP] scrubber: scrub_finish before flags: AUTO_REPAIR. repair state: no-repair. 
deep_scrub_on_error: 0 2026-03-08T22:53:21.803 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:52:55.667+0000 7fbc6382a640 10 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep+inconsistent [ 1.0: AUTO_REPAIR ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 1 2026-03-08T22:53:21.803 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:52:55.667+0000 7fbc6382a640 15 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+inconsistent+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish: 1 errors. 1 errors fixed 2026-03-08T22:53:21.803 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:52:55.667+0000 7fbc6382a640 20 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+inconsistent+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish All may be fixed 2026-03-08T22:53:21.803 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:52:55.667+0000 7fbc6382a640 19 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+inconsistent+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0 2026-03-08T22:53:21.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:534: TEST_auto_repair_bluestore_scrub: ceph pg 1.0 query 2026-03-08T22:53:21.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:534: 
TEST_auto_repair_bluestore_scrub: jq .info.stats.stat_sum.num_objects_repaired 2026-03-08T22:53:21.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:534: TEST_auto_repair_bluestore_scrub: COUNT=1 2026-03-08T22:53:21.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:535: TEST_auto_repair_bluestore_scrub: test 1 = 1 2026-03-08T22:53:21.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T22:53:21.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:53:21.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:53:21.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:53:21.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:53:21.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:53:21.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:53:21.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:53:21.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 
2026-03-08T22:53:22.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:53:22.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:53:22.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:53:22.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:53:22.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:53:22.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:53:22.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:53:22.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:53:22.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:53:22.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:53:22.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:53:22.015 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:53:22.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:53:22.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:53:22.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:53:22.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:22.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:22.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: 
setup: local dir=td/osd-scrub-repair 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:53:22.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:53:22.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:53:22.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:53:22.037 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:53:22.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T22:53:22.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:53:22.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:53:22.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:53:22.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:53:22.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:53:22.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:53:22.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:53:22.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:53:22.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:53:22.041 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:53:22.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:53:22.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:22.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:22.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:53:22.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:53:22.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:53:22.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T22:53:22.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:53:22.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:22.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:22.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 
2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_auto_repair_bluestore_tag td/osd-scrub-repair 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:383: TEST_auto_repair_bluestore_tag: local dir=td/osd-scrub-repair 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:384: TEST_auto_repair_bluestore_tag: local poolname=testpool 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:387: TEST_auto_repair_bluestore_tag: run_mon td/osd-scrub-repair a 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 
2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:53:22.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T22:53:22.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T22:53:22.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:53:22.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:22.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:22.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:22.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:22.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:22.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:22.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 
--mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:53:22.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:53:22.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:53:22.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:53:22.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:53:22.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:53:22.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:53:22.104 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:53:22.104 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:53:22.104 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:53:22.105 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:53:22.105 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:22.105 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:22.105 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:53:22.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:53:22.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T22:53:22.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:53:22.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:53:22.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:53:22.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:53:22.184 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:53:22.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T22:53:22.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:388: TEST_auto_repair_bluestore_tag: run_mgr td/osd-scrub-repair x 2026-03-08T22:53:22.253 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T22:53:22.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:53:22.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:53:22.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:53:22.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T22:53:22.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:53:22.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:53:22.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:22.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:22.360 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:22.360 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:22.360 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 
2026-03-08T22:53:22.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:22.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:53:22.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:53:22.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:395: TEST_auto_repair_bluestore_tag: local 'ceph_osd_args=--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq' 2026-03-08T22:53:22.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:396: TEST_auto_repair_bluestore_tag: seq 0 2 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:396: TEST_auto_repair_bluestore_tag: for id in $(seq 0 2) 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:397: TEST_auto_repair_bluestore_tag: run_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 
2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:53:22.392 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:53:22.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:53:22.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:53:22.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:22.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:22.396 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:22.396 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:22.396 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:22.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:22.397 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:22.397 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq' 2026-03-08T22:53:22.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:53:22.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:53:22.398 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 b1c5fdb9-636f-41f9-943b-d9e337462a48 2026-03-08T22:53:22.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b1c5fdb9-636f-41f9-943b-d9e337462a48 2026-03-08T22:53:22.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 b1c5fdb9-636f-41f9-943b-d9e337462a48' 2026-03-08T22:53:22.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:53:22.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDi/a1pLk57GBAA+I0DSXStic7RXK2n7U7QTQ== 2026-03-08T22:53:22.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDi/a1pLk57GBAA+I0DSXStic7RXK2n7U7QTQ=="}' 2026-03-08T22:53:22.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd 
new b1c5fdb9-636f-41f9-943b-d9e337462a48 -i td/osd-scrub-repair/0/new.json 2026-03-08T22:53:22.524 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:53:22.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T22:53:22.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq --mkfs --key AQDi/a1pLk57GBAA+I0DSXStic7RXK2n7U7QTQ== --osd-uuid b1c5fdb9-636f-41f9-943b-d9e337462a48 2026-03-08T22:53:22.560 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:22.559+0000 7f4f30ff48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:22.564 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:22.563+0000 7f4f30ff48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:22.565 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:22.563+0000 7f4f30ff48c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:22.565 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:22.563+0000 7f4f30ff48c0 -1 bdev(0x562794caac00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:53:22.566 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:22.563+0000 7f4f30ff48c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T22:53:24.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T22:53:24.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:53:24.926 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T22:53:24.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:53:24.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:53:25.041 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:53:25.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:53:25.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:53:25.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:53:25.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 
2026-03-08T22:53:25.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:53:25.097 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:25.095+0000 7fdfb01598c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:25.106 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:25.103+0000 7fdfb01598c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:25.112 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:25.107+0000 7fdfb01598c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:25.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:25.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:26.053 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:26.051+0000 7fdfb01598c0 -1 Falling back to public interface 2026-03-08T22:53:26.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:53:26.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:26.356 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:53:26.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:53:26.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:26.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:26.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:27.021 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:27.019+0000 7fdfb01598c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:53:27.526 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:53:27.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:27.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:27.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:53:27.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:27.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 
up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3153724128,v1:127.0.0.1:6803/3153724128] [v2:127.0.0.1:6804/3153724128,v1:127.0.0.1:6805/3153724128] exists,up b1c5fdb9-636f-41f9-943b-d9e337462a48 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:396: TEST_auto_repair_bluestore_tag: for id in $(seq 0 2) 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:397: TEST_auto_repair_bluestore_tag: run_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: 
run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:53:27.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:53:27.700 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:27.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:53:27.701 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq' 2026-03-08T22:53:27.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:53:27.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:53:27.703 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 c59cd650-7ec6-414e-9d4a-200cababa72a 2026-03-08T22:53:27.703 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=c59cd650-7ec6-414e-9d4a-200cababa72a 2026-03-08T22:53:27.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 c59cd650-7ec6-414e-9d4a-200cababa72a' 2026-03-08T22:53:27.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:53:27.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDn/a1p15GZKhAAkn6EdWJPXV9dSoSjf8OOnw== 2026-03-08T22:53:27.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDn/a1p15GZKhAAkn6EdWJPXV9dSoSjf8OOnw=="}' 2026-03-08T22:53:27.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new c59cd650-7ec6-414e-9d4a-200cababa72a -i td/osd-scrub-repair/1/new.json 2026-03-08T22:53:27.899 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:53:27.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T22:53:27.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq --mkfs --key AQDn/a1p15GZKhAAkn6EdWJPXV9dSoSjf8OOnw== --osd-uuid c59cd650-7ec6-414e-9d4a-200cababa72a 2026-03-08T22:53:27.934 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:27.931+0000 7f4e50b9f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:27.936 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:27.935+0000 7f4e50b9f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:27.937 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:27.935+0000 7f4e50b9f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:27.937 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:27.935+0000 7f4e50b9f8c0 -1 bdev(0x5632b821bc00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:53:27.937 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:27.935+0000 7f4e50b9f8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T22:53:30.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T22:53:30.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:53:30.201 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T22:53:30.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:53:30.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:53:30.407 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T22:53:30.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T22:53:30.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:53:30.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:53:30.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:53:30.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:53:30.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:30.423+0000 7fb35c3d38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:30.426 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:30.423+0000 7fb35c3d38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:30.428 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:30.427+0000 7fb35c3d38c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:30.606 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:53:30.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:53:30.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:53:30.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:53:30.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:53:30.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:53:30.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:30.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:53:30.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:30.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:53:30.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:31.152 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:31.151+0000 7fb35c3d38c0 -1 Falling back to public interface 2026-03-08T22:53:31.779 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:53:31.780 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:31.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:31.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:53:31.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:31.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:53:31.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:32.233 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:32.231+0000 7fb35c3d38c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:53:32.956 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:53:32.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:32.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:32.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:53:32.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:32.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:53:33.132 
INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3582807927,v1:127.0.0.1:6811/3582807927] [v2:127.0.0.1:6812/3582807927,v1:127.0.0.1:6813/3582807927] exists,up c59cd650-7ec6-414e-9d4a-200cababa72a 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:396: TEST_auto_repair_bluestore_tag: for id in $(seq 0 2) 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:397: TEST_auto_repair_bluestore_tag: run_osd td/osd-scrub-repair 2 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:53:33.133 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:53:33.133 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:33.133 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:53:33.134 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq' 2026-03-08T22:53:33.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:53:33.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:53:33.136 
INFO:tasks.workunit.client.0.vm03.stdout:add osd2 34780a38-258e-4357-b763-7646e1e3c02f 2026-03-08T22:53:33.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=34780a38-258e-4357-b763-7646e1e3c02f 2026-03-08T22:53:33.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 34780a38-258e-4357-b763-7646e1e3c02f' 2026-03-08T22:53:33.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:53:33.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDt/a1pnZnXCBAAfrcrkxej0JKjv7yu6rqS+A== 2026-03-08T22:53:33.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDt/a1pnZnXCBAAfrcrkxej0JKjv7yu6rqS+A=="}' 2026-03-08T22:53:33.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 34780a38-258e-4357-b763-7646e1e3c02f -i td/osd-scrub-repair/2/new.json 2026-03-08T22:53:33.377 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:53:33.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T22:53:33.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= 
--run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq --mkfs --key AQDt/a1pnZnXCBAAfrcrkxej0JKjv7yu6rqS+A== --osd-uuid 34780a38-258e-4357-b763-7646e1e3c02f 2026-03-08T22:53:33.409 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:33.407+0000 7f4af91b98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:33.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:33.411+0000 7f4af91b98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:33.412 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:33.411+0000 7f4af91b98c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:33.413 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:33.411+0000 7f4af91b98c0 -1 bdev(0x564923b1bc00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:53:33.413 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:33.411+0000 7f4af91b98c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T22:53:35.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T22:53:35.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:53:35.729 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T22:53:35.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:53:35.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:53:35.938 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T22:53:35.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:53:35.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:53:35.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:53:35.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:53:35.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:53:35.957 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:35.955+0000 7fc7c0c808c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:35.961 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:35.959+0000 7fc7c0c808c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:35.967 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:35.959+0000 7fc7c0c808c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:36.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:53:36.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:37.168 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:37.167+0000 7fc7c0c808c0 -1 Falling back to public interface 2026-03-08T22:53:37.319 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:53:37.319 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:37.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:37.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:53:37.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:37.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:53:37.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:38.507 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:53:38.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:38.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:38.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:53:38.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:38.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:53:38.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 
2026-03-08T22:53:38.700 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:38.699+0000 7fc7c0c808c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:53:39.675 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:53:39.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:39.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:39.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:53:39.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:39.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:53:39.861 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2521390917,v1:127.0.0.1:6819/2521390917] [v2:127.0.0.1:6820/2521390917,v1:127.0.0.1:6821/2521390917] exists,up 34780a38-258e-4357-b763-7646e1e3c02f 2026-03-08T22:53:39.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:53:39.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:53:39.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:53:39.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:400: TEST_auto_repair_bluestore_tag: create_pool 
testpool 1 1 2026-03-08T22:53:39.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create testpool 1 1 2026-03-08T22:53:40.104 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool' created 2026-03-08T22:53:40.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:53:41.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:401: TEST_auto_repair_bluestore_tag: ceph osd pool set testpool size 2 2026-03-08T22:53:41.332 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 size to 2 2026-03-08T22:53:41.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:402: TEST_auto_repair_bluestore_tag: wait_for_clean 2026-03-08T22:53:41.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:53:41.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:53:41.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:53:41.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:53:41.351 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:53:41.351 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 
2026-03-08T22:53:41.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:53:41.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:53:41.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:53:41.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:53:41.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:53:41.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:53:41.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:53:41.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:53:41.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:53:41.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:53:41.606 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:53:41.606 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:53:41.606 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:53:41.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:41.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:53:41.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836484 2026-03-08T22:53:41.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836484 2026-03-08T22:53:41.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836484' 2026-03-08T22:53:41.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:41.694 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:53:41.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672963 2026-03-08T22:53:41.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672963 2026-03-08T22:53:41.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836484 1-42949672963' 2026-03-08T22:53:41.777 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:41.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:53:41.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509442 2026-03-08T22:53:41.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509442 2026-03-08T22:53:41.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836484 1-42949672963 2-64424509442' 2026-03-08T22:53:41.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:41.865 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836484 2026-03-08T22:53:41.865 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:53:41.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:53:41.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836484 2026-03-08T22:53:41.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:53:41.867 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836484 
2026-03-08T22:53:41.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836484 2026-03-08T22:53:41.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836484' 2026-03-08T22:53:41.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:53:42.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836484 2026-03-08T22:53:42.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:53:43.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:53:43.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:53:43.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836484 2026-03-08T22:53:43.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:43.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672963 2026-03-08T22:53:43.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:53:43.238 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:53:43.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672963 2026-03-08T22:53:43.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:53:43.240 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963 2026-03-08T22:53:43.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672963 2026-03-08T22:53:43.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672963' 2026-03-08T22:53:43.240 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:53:43.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672963 -lt 42949672963 2026-03-08T22:53:43.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:43.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509442 2026-03-08T22:53:43.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:53:43.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:53:43.433 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509442 2026-03-08T22:53:43.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:53:43.435 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442 2026-03-08T22:53:43.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509442 2026-03-08T22:53:43.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509442' 2026-03-08T22:53:43.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:53:43.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509442 2026-03-08T22:53:43.618 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:53:43.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:53:43.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:53:43.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:53:43.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 
2026-03-08T22:53:43.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:53:43.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:53:43.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:53:43.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:53:43.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:53:43.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:53:44.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:53:44.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:53:44.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:53:44.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:53:44.229 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:53:44.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:53:44.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:53:44.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:405: TEST_auto_repair_bluestore_tag: local payload=ABCDEF 2026-03-08T22:53:44.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:406: TEST_auto_repair_bluestore_tag: echo ABCDEF 2026-03-08T22:53:44.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:407: TEST_auto_repair_bluestore_tag: rados --pool testpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T22:53:44.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:411: TEST_auto_repair_bluestore_tag: get_not_primary testpool SOMETHING 2026-03-08T22:53:44.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool 2026-03-08T22:53:44.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:53:44.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:53:44.255 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:53:44.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:53:44.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:53:44.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:53:44.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:53:44.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:53:44.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. 
!= 1)) | .[0]' 2026-03-08T22:53:44.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:411: TEST_auto_repair_bluestore_tag: objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:53:44.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:53:44.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:53:44.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:53:44.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:53:44.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:53:44.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:53:44.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:53:44.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:53:44.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:53:44.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: 
_objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:53:44.617 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:53:44.617 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:53:44.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:53:44.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:53:44.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:53:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:53:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T22:53:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:53:44.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:53:44.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T22:53:44.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 
2026-03-08T22:53:44.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:53:44.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING remove 2026-03-08T22:53:45.360 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:eb822e21:::SOMETHING:head# 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 
2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: 
get_asok_path: '[' -n '' ']' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:45.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:53:45.897 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq' 2026-03-08T22:53:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:53:45.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:53:45.898 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:53:45.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:53:45.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:53:45.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:53:45.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:53:45.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:53:45.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:53:45.919 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:45.915+0000 7fe7080f48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:45.919 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:45.915+0000 7fe7080f48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:53:45.921 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:45.919+0000 7fe7080f48c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:53:46.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:53:46.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:53:46.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:53:46.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:53:46.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:53:46.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:46.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:53:46.107 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:53:46.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:46.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:46.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:46.888 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:46.887+0000 7fe7080f48c0 -1 Falling back to public interface 2026-03-08T22:53:47.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:53:47.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:47.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:53:47.326 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:53:47.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:47.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:47.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:53:47.848 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:53:47.847+0000 7fe7080f48c0 -1 osd.0 20 log_to_monitors true 2026-03-08T22:53:48.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:53:48.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:53:48.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:53:48.532 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:53:48.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:53:48.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 
up_from 24 up_thru 0 down_at 21 last_clean_interval [5,20) [v2:127.0.0.1:6802/3057572570,v1:127.0.0.1:6803/3057572570] [v2:127.0.0.1:6804/3057572570,v1:127.0.0.1:6805/3057572570] exists,up b1c5fdb9-636f-41f9-943b-d9e337462a48 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:53:48.704 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:53:48.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:53:48.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:53:48.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:53:48.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:53:48.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:53:48.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:53:48.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:53:48.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:53:48.940 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:53:48.940 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:53:48.940 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:53:48.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:48.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:53:49.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215106 2026-03-08T22:53:49.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215106 2026-03-08T22:53:49.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106' 2026-03-08T22:53:49.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:49.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:53:49.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672966 2026-03-08T22:53:49.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672966 2026-03-08T22:53:49.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672966' 2026-03-08T22:53:49.109 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:49.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:53:49.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509444 2026-03-08T22:53:49.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509444 2026-03-08T22:53:49.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672966 2-64424509444' 2026-03-08T22:53:49.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:49.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215106 2026-03-08T22:53:49.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:53:49.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:53:49.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215106 2026-03-08T22:53:49.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:53:49.189 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215106 2026-03-08T22:53:49.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215106' 2026-03-08T22:53:49.189 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 103079215106 2026-03-08T22:53:49.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:53:49.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215106 -lt 103079215106 2026-03-08T22:53:49.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:49.365 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672966 2026-03-08T22:53:49.365 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:53:49.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:53:49.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672966 2026-03-08T22:53:49.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:53:49.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672966 
2026-03-08T22:53:49.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672966' 2026-03-08T22:53:49.368 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672966 2026-03-08T22:53:49.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:53:49.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672966 2026-03-08T22:53:49.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:53:50.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:53:50.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:53:50.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672966 2026-03-08T22:53:50.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:53:51.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:53:51.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:53:51.915 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672966 -lt 42949672966 2026-03-08T22:53:51.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:53:51.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509444 2026-03-08T22:53:51.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:53:51.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:53:51.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509444 2026-03-08T22:53:51.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:53:51.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509444 2026-03-08T22:53:51.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509444' 2026-03-08T22:53:51.919 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509444 2026-03-08T22:53:51.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:53:52.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509445 -lt 
64424509444 2026-03-08T22:53:52.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:53:52.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:53:52.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:53:52.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T22:53:52.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:53:52.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:53:52.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:53:52.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:53:52.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:53:52.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:53:52.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq 
'.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:53:52.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T22:53:52.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:53:52.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:53:52.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:53:52.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T22:53:52.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:53:52.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:53:52.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:413: TEST_auto_repair_bluestore_tag: get_pg testpool SOMETHING 2026-03-08T22:53:52.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=testpool 2026-03-08T22:53:52.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:53:52.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: 
ceph --format json osd map testpool SOMETHING 2026-03-08T22:53:52.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:53:52.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:413: TEST_auto_repair_bluestore_tag: local pgid=1.0 2026-03-08T22:53:52.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:414: TEST_auto_repair_bluestore_tag: get_primary testpool SOMETHING 2026-03-08T22:53:52.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:53:52.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:53:52.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:53:52.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:53:53.129 INFO:tasks.workunit.client.0.vm03.stdout:Affected PG 1.0 w/ primary 1 2026-03-08T22:53:53.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:414: TEST_auto_repair_bluestore_tag: local primary=1 2026-03-08T22:53:53.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:415: TEST_auto_repair_bluestore_tag: echo 'Affected PG ' 1.0 ' w/ primary ' 1 2026-03-08T22:53:53.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:416: 
TEST_auto_repair_bluestore_tag: get_last_scrub_stamp 1.0 2026-03-08T22:53:53.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:53:53.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:53:53.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:53:53.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:53:53.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:416: TEST_auto_repair_bluestore_tag: local last_scrub_stamp=2026-03-08T22:53:40.101819+0000 2026-03-08T22:53:53.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:417: TEST_auto_repair_bluestore_tag: initiate_and_fetch_state 1 1.0 3.0 2026-03-08T22:53:53.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:322: initiate_and_fetch_state: local the_osd=osd.1 2026-03-08T22:53:53.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:323: initiate_and_fetch_state: local pgid=1.0 2026-03-08T22:53:53.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:324: initiate_and_fetch_state: get_last_scrub_stamp 1.0 2026-03-08T22:53:53.330 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T22:53:53.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:53:53.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:53:53.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T22:53:53.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:324: initiate_and_fetch_state: local last_scrub=2026-03-08T22:53:40.101819+0000 2026-03-08T22:53:53.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:326: initiate_and_fetch_state: set_config osd 1 osd_scrub_sleep 3.0 2026-03-08T22:53:53.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:53:53.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1 2026-03-08T22:53:53.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=osd_scrub_sleep 2026-03-08T22:53:53.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=3.0 2026-03-08T22:53:53.509 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:53:53.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1 2026-03-08T22:53:53.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:53:53.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:53:53.509 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:53:53.509 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:53.509 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:53.510 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T22:53:53.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.1.asok config set osd_scrub_sleep 3.0 2026-03-08T22:53:53.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:327: initiate_and_fetch_state: set_config osd 1 
osd_scrub_auto_repair true 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=osd_scrub_auto_repair 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T22:53:53.585 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T22:53:53.586 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:53:53.586 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:53:53.586 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:53:53.586 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T22:53:53.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.1.asok config set osd_scrub_auto_repair true 2026-03-08T22:53:53.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T22:53:53.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:329: initiate_and_fetch_state: flush_pg_stats 2026-03-08T22:53:53.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:53:53.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:53:53.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:53:53.832 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:53:53.832 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:53:53.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:53:53.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:53.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:53:53.970 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215108 2026-03-08T22:53:53.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215108 2026-03-08T22:53:53.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215108' 2026-03-08T22:53:53.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:53.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:53:54.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672968 2026-03-08T22:53:54.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672968 2026-03-08T22:53:54.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215108 1-42949672968' 2026-03-08T22:53:54.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:53:54.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:53:54.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509446 2026-03-08T22:53:54.158 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509446
2026-03-08T22:53:54.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215108 1-42949672968 2-64424509446'
2026-03-08T22:53:54.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:54.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215108
2026-03-08T22:53:54.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:54.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:53:54.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215108
2026-03-08T22:53:54.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:54.160 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 103079215108
2026-03-08T22:53:54.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215108
2026-03-08T22:53:54.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215108'
2026-03-08T22:53:54.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:53:54.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215107 -lt 103079215108
2026-03-08T22:53:54.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:53:55.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:53:55.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:53:55.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215108 -lt 103079215108
2026-03-08T22:53:55.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:55.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672968
2026-03-08T22:53:55.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:55.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:53:55.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672968
2026-03-08T22:53:55.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:55.522 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672968
2026-03-08T22:53:55.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672968
2026-03-08T22:53:55.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672968'
2026-03-08T22:53:55.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:53:55.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672968 -lt 42949672968
2026-03-08T22:53:55.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:55.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509446
2026-03-08T22:53:55.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:55.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:53:55.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509446
2026-03-08T22:53:55.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:55.699 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509446
2026-03-08T22:53:55.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509446
2026-03-08T22:53:55.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509446'
2026-03-08T22:53:55.699 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:53:55.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509447 -lt 64424509446
2026-03-08T22:53:55.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:330: initiate_and_fetch_state: date --rfc-3339=ns
2026-03-08T22:53:55.886 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08 22:53:55.887987693+00:00
2026-03-08T22:53:55.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:333: initiate_and_fetch_state: get_asok_path osd.1
2026-03-08T22:53:55.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T22:53:55.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T22:53:55.887 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:53:55.888 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:53:55.888 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:53:55.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok
2026-03-08T22:53:55.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:333: initiate_and_fetch_state: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.1.asok schedule-deep-scrub 1.0
2026-03-08T22:53:55.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:336: initiate_and_fetch_state: (( i=0 ))
2026-03-08T22:53:55.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:336: initiate_and_fetch_state: (( i < 80 ))
2026-03-08T22:53:55.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:338: initiate_and_fetch_state: ceph pg 1.0 query --format json
2026-03-08T22:53:55.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:338: initiate_and_fetch_state: jq .state
2026-03-08T22:53:56.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:338: initiate_and_fetch_state: st='"active+clean"'
2026-03-08T22:53:56.058 INFO:tasks.workunit.client.0.vm03.stdout:{"deep":true,"must":false,"stamp":"2026-02-22T22:52:15.956872+0000"}0 ) state now: "active+clean"
2026-03-08T22:53:56.058 INFO:tasks.workunit.client.0.vm03.stdout:"active+clean"
2026-03-08T22:53:56.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:339: initiate_and_fetch_state: echo 0 ') state now: ' '"active+clean"'
2026-03-08T22:53:56.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:341: initiate_and_fetch_state: case "$st" in
2026-03-08T22:53:56.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:346: initiate_and_fetch_state: echo '"active+clean"'
2026-03-08T22:53:56.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:349: initiate_and_fetch_state: '[' 0 == 4 ']'
2026-03-08T22:53:56.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:352: initiate_and_fetch_state: sleep 0.3
2026-03-08T22:53:56.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:336: initiate_and_fetch_state: (( i++ ))
2026-03-08T22:53:56.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:336: initiate_and_fetch_state: (( i < 80 ))
2026-03-08T22:53:56.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:338: initiate_and_fetch_state: ceph pg 1.0 query --format json
2026-03-08T22:53:56.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:338: initiate_and_fetch_state: jq .state
2026-03-08T22:53:56.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:338: initiate_and_fetch_state: st='"active+clean+scrubbing+deep"'
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stdout:1 ) state now: "active+clean+scrubbing+deep"
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:339: initiate_and_fetch_state: echo 1 ') state now: ' '"active+clean+scrubbing+deep"'
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:341: initiate_and_fetch_state: case "$st" in
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stdout:found scrub
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:343: initiate_and_fetch_state: echo 'found scrub'
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:343: initiate_and_fetch_state: return 0
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:418: TEST_auto_repair_bluestore_tag: r=0
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:419: TEST_auto_repair_bluestore_tag: echo 'initiate_and_fetch_state ret: ' 0
2026-03-08T22:53:56.449 INFO:tasks.workunit.client.0.vm03.stdout:initiate_and_fetch_state ret: 0
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:420: TEST_auto_repair_bluestore_tag: set_config osd td/osd-scrub-repair osd_scrub_sleep 0
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=td/osd-scrub-repair
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=osd_scrub_sleep
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=0
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")'
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.td/osd-scrub-repair
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.td/osd-scrub-repair
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.td/osd-scrub-repair ']'
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.td/osd-scrub-repair.asok
2026-03-08T22:53:56.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.td/osd-scrub-repair.asok config set osd_scrub_sleep 0
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test == true
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh: line 1168: test: ==: unary operator expected
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:421: TEST_auto_repair_bluestore_tag: '[' 0 -ne 0 ']'
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:425: TEST_auto_repair_bluestore_tag: wait_end_of_scrub 1 1.0
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:360: wait_end_of_scrub: local the_osd=osd.1
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:361: wait_end_of_scrub: local pgid=1.0
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i=0 ))
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i < 40 ))
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: ceph pg 1.0 query --format json
2026-03-08T22:53:56.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: jq .state
2026-03-08T22:53:56.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: st='"active+clean+scrubbing+deep"'
2026-03-08T22:53:56.598 INFO:tasks.workunit.client.0.vm03.stdout:wait-scrub-end state now: "active+clean+scrubbing+deep"
2026-03-08T22:53:56.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:365: wait_end_of_scrub: echo 'wait-scrub-end state now: ' '"active+clean+scrubbing+deep"'
2026-03-08T22:53:56.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:366: wait_end_of_scrub: [[ "active+clean+scrubbing+deep" =~ (.*scrubbing.*) ]]
2026-03-08T22:53:56.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:367: wait_end_of_scrub: '[' 0 == 4 ']'
2026-03-08T22:53:56.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:370: wait_end_of_scrub: sleep 0.3
2026-03-08T22:53:56.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i++ ))
2026-03-08T22:53:56.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i < 40 ))
2026-03-08T22:53:56.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: ceph pg 1.0 query --format json
2026-03-08T22:53:56.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: jq .state
2026-03-08T22:53:56.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: st='"active+clean+scrubbing+deep"'
2026-03-08T22:53:56.987 INFO:tasks.workunit.client.0.vm03.stdout:wait-scrub-end state now: "active+clean+scrubbing+deep"
2026-03-08T22:53:56.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:365: wait_end_of_scrub: echo 'wait-scrub-end state now: ' '"active+clean+scrubbing+deep"'
2026-03-08T22:53:56.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:366: wait_end_of_scrub: [[ "active+clean+scrubbing+deep" =~ (.*scrubbing.*) ]]
2026-03-08T22:53:56.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:367: wait_end_of_scrub: '[' 1 == 4 ']'
2026-03-08T22:53:56.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:370: wait_end_of_scrub: sleep 0.3
2026-03-08T22:53:57.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i++ ))
2026-03-08T22:53:57.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i < 40 ))
2026-03-08T22:53:57.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: ceph pg 1.0 query --format json
2026-03-08T22:53:57.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: jq .state
2026-03-08T22:53:57.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: st='"active+clean+scrubbing+deep"'
2026-03-08T22:53:57.463 INFO:tasks.workunit.client.0.vm03.stdout:wait-scrub-end state now: "active+clean+scrubbing+deep"
2026-03-08T22:53:57.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:365: wait_end_of_scrub: echo 'wait-scrub-end state now: ' '"active+clean+scrubbing+deep"'
2026-03-08T22:53:57.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:366: wait_end_of_scrub: [[ "active+clean+scrubbing+deep" =~ (.*scrubbing.*) ]]
2026-03-08T22:53:57.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:367: wait_end_of_scrub: '[' 2 == 4 ']'
2026-03-08T22:53:57.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:370: wait_end_of_scrub: sleep 0.3
2026-03-08T22:53:57.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i++ ))
2026-03-08T22:53:57.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i < 40 ))
2026-03-08T22:53:57.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: ceph pg 1.0 query --format json
2026-03-08T22:53:57.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: jq .state
2026-03-08T22:53:57.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: st='"active+clean+scrubbing+deep"'
2026-03-08T22:53:57.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:365: wait_end_of_scrub: echo 'wait-scrub-end state now: ' '"active+clean+scrubbing+deep"'
2026-03-08T22:53:57.810 INFO:tasks.workunit.client.0.vm03.stdout:wait-scrub-end state now: "active+clean+scrubbing+deep"
2026-03-08T22:53:57.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:366: wait_end_of_scrub: [[ "active+clean+scrubbing+deep" =~ (.*scrubbing.*) ]]
2026-03-08T22:53:57.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:367: wait_end_of_scrub: '[' 3 == 4 ']'
2026-03-08T22:53:57.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:370: wait_end_of_scrub: sleep 0.3
2026-03-08T22:53:58.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i++ ))
2026-03-08T22:53:58.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i < 40 ))
2026-03-08T22:53:58.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: ceph pg 1.0 query --format json
2026-03-08T22:53:58.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: jq .state
2026-03-08T22:53:58.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: st='"active+clean+scrubbing+deep"'
2026-03-08T22:53:58.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:365: wait_end_of_scrub: echo 'wait-scrub-end state now: ' '"active+clean+scrubbing+deep"'
2026-03-08T22:53:58.219 INFO:tasks.workunit.client.0.vm03.stdout:wait-scrub-end state now: "active+clean+scrubbing+deep"
2026-03-08T22:53:58.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:366: wait_end_of_scrub: [[ "active+clean+scrubbing+deep" =~ (.*scrubbing.*) ]]
2026-03-08T22:53:58.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:367: wait_end_of_scrub: '[' 4 == 4 ']'
2026-03-08T22:53:58.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:368: wait_end_of_scrub: flush_pg_stats
2026-03-08T22:53:58.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:53:58.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:53:58.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:53:58.375 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:53:58.375 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:53:58.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:53:58.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:58.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:53:58.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215110
2026-03-08T22:53:58.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215110
2026-03-08T22:53:58.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215110'
2026-03-08T22:53:58.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:58.564 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:53:58.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970
2026-03-08T22:53:58.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970
2026-03-08T22:53:58.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215110 1-42949672970'
2026-03-08T22:53:58.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:53:58.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:53:58.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509448
2026-03-08T22:53:58.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509448
2026-03-08T22:53:58.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215110 1-42949672970 2-64424509448'
2026-03-08T22:53:58.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:53:58.830 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215110
2026-03-08T22:53:58.830 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:53:58.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:53:58.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215110
2026-03-08T22:53:58.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:53:58.833 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 103079215110
2026-03-08T22:53:58.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215110
2026-03-08T22:53:58.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215110'
2026-03-08T22:53:58.833 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:53:59.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215108 -lt 103079215110
2026-03-08T22:53:59.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:54:00.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:54:00.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:54:00.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215110 -lt 103079215110
2026-03-08T22:54:00.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:00.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672970
2026-03-08T22:54:00.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:00.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:54:00.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970
2026-03-08T22:54:00.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:00.230 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672970
2026-03-08T22:54:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970
2026-03-08T22:54:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970'
2026-03-08T22:54:00.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:54:00.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672970
2026-03-08T22:54:00.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:00.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509448
2026-03-08T22:54:00.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:00.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:54:00.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509448
2026-03-08T22:54:00.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:00.412 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509448
2026-03-08T22:54:00.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509448
2026-03-08T22:54:00.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509448'
2026-03-08T22:54:00.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:54:00.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509448 -lt 64424509448
2026-03-08T22:54:00.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:370: wait_end_of_scrub: sleep 0.3
2026-03-08T22:54:00.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i++ ))
2026-03-08T22:54:00.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:363: wait_end_of_scrub: (( i < 40 ))
2026-03-08T22:54:00.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: ceph pg 1.0 query --format json
2026-03-08T22:54:00.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: jq .state
2026-03-08T22:54:00.978 INFO:tasks.workunit.client.0.vm03.stdout:wait-scrub-end state now: "active+clean"
2026-03-08T22:54:00.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:364: wait_end_of_scrub: st='"active+clean"'
2026-03-08T22:54:00.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:365: wait_end_of_scrub: echo 'wait-scrub-end state now: ' '"active+clean"'
2026-03-08T22:54:00.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:366: wait_end_of_scrub: [[ "active+clean" =~ (.*scrubbing.*) ]]
2026-03-08T22:54:00.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:366: wait_end_of_scrub: break
2026-03-08T22:54:00.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:373: wait_end_of_scrub: [[ "active+clean" =~ (.*scrubbing.*) ]]
2026-03-08T22:54:00.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:378: wait_end_of_scrub: return 0
2026-03-08T22:54:00.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:426: TEST_auto_repair_bluestore_tag: ceph pg dump pgs
2026-03-08T22:54:01.165 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-08T22:54:01.165 INFO:tasks.workunit.client.0.vm03.stdout:1.0 1 0 0 0 0 7 0 0 1 0 1 active+clean+scrubbing+deep 2026-03-08T22:53:56.413462+0000 20'1 25:46 [1,0] 1 [1,0] 1 0'0 2026-02-22T22:52:15.956872+0000 0'0 2026-02-22T22:52:15.956872+0000 0 0 deep scrubbing for 2s 0 0
2026-03-08T22:54:01.165 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-08T22:54:01.165 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-08T22:54:01.165 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T22:54:01.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:430: TEST_auto_repair_bluestore_tag: get_not_primary testpool SOMETHING 2026-03-08T22:54:01.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool 2026-03-08T22:54:01.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:54:01.181 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING 2026-03-08T22:54:01.181 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool 2026-03-08T22:54:01.181 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:54:01.181 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:54:01.181 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:54:01.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T22:54:01.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING 2026-03-08T22:54:01.364 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T22:54:01.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:430: TEST_auto_repair_bluestore_tag: objectstore_tool td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T22:54:01.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:54:01.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:54:01.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T22:54:01.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:54:01.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: 
_objectstore_tool_nowait: shift 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:54:01.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:54:01.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:54:01.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T22:54:01.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:54:01.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:54:01.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 
2026-03-08T22:54:01.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:54:01.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:54:01.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING list-attrs 2026-03-08T22:54:01.967 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T22:54:01.967 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:54:02.499 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:54:02.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:54:02.500 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' 
--osd-max-object-name-len=460' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq' 2026-03-08T22:54:02.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:54:02.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T22:54:02.501 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T22:54:02.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= 
--run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq 2026-03-08T22:54:02.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T22:54:02.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T22:54:02.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:54:02.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:54:02.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:54:02.519 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:02.515+0000 7f160c5f78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:02.520 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:02.519+0000 7f160c5f78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:02.522 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:02.519+0000 7f160c5f78c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:02.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T22:54:02.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:02.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:54:02.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:02.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:02.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:02.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:02.690 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:54:02.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:02.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:02.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:03.709 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:03.707+0000 7f160c5f78c0 -1 Falling back to public interface 2026-03-08T22:54:03.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:54:03.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:03.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:03.866 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:54:03.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:03.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:04.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:04.934 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:04.931+0000 7f160c5f78c0 -1 osd.0 25 log_to_monitors true 2026-03-08T22:54:05.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:05.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:05.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:05.049 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:54:05.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:05.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 
up_from 29 up_thru 0 down_at 26 last_clean_interval [24,25) [v2:127.0.0.1:6802/3194458349,v1:127.0.0.1:6803/3194458349] [v2:127.0.0.1:6804/3194458349,v1:127.0.0.1:6805/3194458349] exists,up b1c5fdb9-636f-41f9-943b-d9e337462a48 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:54:05.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:54:05.238 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:54:05.238 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:54:05.238 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:54:05.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:54:05.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:54:05.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:54:05.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:54:05.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:54:05.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:54:05.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:54:05.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:54:05.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:54:05.514 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:54:05.514 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:54:05.514 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:54:05.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:05.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:54:05.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051586 2026-03-08T22:54:05.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051586 2026-03-08T22:54:05.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586' 2026-03-08T22:54:05.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:05.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:54:05.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672972 2026-03-08T22:54:05.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672972 2026-03-08T22:54:05.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586 1-42949672972' 2026-03-08T22:54:05.697 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:05.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:54:05.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509451 2026-03-08T22:54:05.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509451 2026-03-08T22:54:05.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-124554051586 1-42949672972 2-64424509451' 2026-03-08T22:54:05.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:05.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-124554051586 2026-03-08T22:54:05.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:05.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:54:05.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-124554051586 2026-03-08T22:54:05.796 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:05.797 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051586
2026-03-08T22:54:05.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 124554051586'
2026-03-08T22:54:05.797 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 124554051586
2026-03-08T22:54:05.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:54:05.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 124554051586
2026-03-08T22:54:05.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:54:06.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:54:06.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:54:07.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051586 -lt 124554051586
2026-03-08T22:54:07.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:07.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672972
2026-03-08T22:54:07.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:07.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:54:07.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672972
2026-03-08T22:54:07.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:07.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672972
2026-03-08T22:54:07.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672972'
2026-03-08T22:54:07.162 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672972
2026-03-08T22:54:07.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:54:07.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672972 -lt 42949672972
2026-03-08T22:54:07.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:07.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509451
2026-03-08T22:54:07.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:07.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:54:07.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509451
2026-03-08T22:54:07.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:07.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509451
2026-03-08T22:54:07.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509451'
2026-03-08T22:54:07.352 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509451
2026-03-08T22:54:07.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:54:07.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509451 -lt 64424509451
2026-03-08T22:54:07.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:54:07.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:54:07.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:54:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T22:54:07.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:54:07.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:54:07.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:54:07.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:54:07.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:54:07.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:54:07.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:54:07.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T22:54:07.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:54:07.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:54:07.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:54:08.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:431: TEST_auto_repair_bluestore_tag: get_not_primary testpool SOMETHING
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool SOMETHING
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool SOMETHING
2026-03-08T22:54:08.118 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:54:08.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1
2026-03-08T22:54:08.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool SOMETHING
2026-03-08T22:54:08.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]'
2026-03-08T22:54:08.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:431: TEST_auto_repair_bluestore_tag: objectstore_tool td/osd-scrub-repair 0 SOMETHING get-bytes td/osd-scrub-repair/COPY
2026-03-08T22:54:08.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T22:54:08.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:54:08.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING get-bytes td/osd-scrub-repair/COPY
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:54:08.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:54:08.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:54:08.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING get-bytes td/osd-scrub-repair/COPY
2026-03-08T22:54:08.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T22:54:08.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:54:08.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T22:54:08.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:54:08.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:54:08.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING get-bytes td/osd-scrub-repair/COPY
2026-03-08T22:54:09.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq
2026-03-08T22:54:09.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:54:09.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq'
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T22:54:09.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --osd-op-queue=wpq
2026-03-08T22:54:09.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T22:54:09.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T22:54:09.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:54:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:54:09.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:54:09.240 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:09.235+0000 7fca39d518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:09.240 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:09.239+0000 7fca39d518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:09.241 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:09.239+0000 7fca39d518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:09.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:54:09.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:54:10.448 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:10.447+0000 7fca39d518c0 -1 Falling back to public interface
2026-03-08T22:54:10.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:54:10.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:10.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:54:10.612 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:54:10.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:10.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:54:10.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:54:11.667 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:11.667+0000 7fca39d518c0 -1 osd.0 30 log_to_monitors true
2026-03-08T22:54:11.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:54:11.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:54:11.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:54:11.791 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:54:11.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:54:11.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 34 up_thru 0 down_at 31 last_clean_interval [29,30) [v2:127.0.0.1:6802/3483250176,v1:127.0.0.1:6803/3483250176] [v2:127.0.0.1:6804/3483250176,v1:127.0.0.1:6805/3483250176] exists,up b1c5fdb9-636f-41f9-943b-d9e337462a48
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:54:11.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:54:11.985 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:54:11.985 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:54:11.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:54:11.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:54:11.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:54:12.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:54:12.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:54:12.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:54:12.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:54:12.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:54:12.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:54:12.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:54:12.251 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:54:12.251 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:54:12.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:54:12.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:54:12.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:54:12.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888066
2026-03-08T22:54:12.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888066
2026-03-08T22:54:12.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066'
2026-03-08T22:54:12.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:54:12.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:54:12.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672974
2026-03-08T22:54:12.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672974
2026-03-08T22:54:12.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066 1-42949672974'
2026-03-08T22:54:12.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:54:12.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:54:12.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509453
2026-03-08T22:54:12.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509453
2026-03-08T22:54:12.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066 1-42949672974 2-64424509453'
2026-03-08T22:54:12.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:12.499 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-146028888066
2026-03-08T22:54:12.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:12.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:54:12.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-146028888066
2026-03-08T22:54:12.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:12.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888066
2026-03-08T22:54:12.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 146028888066'
2026-03-08T22:54:12.503 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 146028888066
2026-03-08T22:54:12.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:54:12.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051586 -lt 146028888066
2026-03-08T22:54:12.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:54:13.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:54:13.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:54:13.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888066 -lt 146028888066
2026-03-08T22:54:13.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:13.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672974
2026-03-08T22:54:13.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:13.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:54:13.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672974
2026-03-08T22:54:13.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:13.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672974
2026-03-08T22:54:13.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672974'
2026-03-08T22:54:13.880 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672974
2026-03-08T22:54:13.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:54:14.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672974 -lt 42949672974
2026-03-08T22:54:14.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:54:14.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509453
2026-03-08T22:54:14.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:54:14.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:54:14.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509453
2026-03-08T22:54:14.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:54:14.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509453
2026-03-08T22:54:14.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509453'
2026-03-08T22:54:14.075 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509453
2026-03-08T22:54:14.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:54:14.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509453 -lt 64424509453
2026-03-08T22:54:14.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:54:14.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:54:14.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:54:14.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T22:54:14.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:54:14.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:54:14.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:54:14.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:54:14.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:54:14.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:54:14.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:54:14.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T22:54:14.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:54:14.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:54:14.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:54:14.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T22:54:14.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:54:14.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:54:14.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:432: TEST_auto_repair_bluestore_tag: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY
2026-03-08T22:54:14.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:433: TEST_auto_repair_bluestore_tag: grep scrub_finish td/osd-scrub-repair/osd.1.log
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:53:59.411+0000 7fb34135f640 10 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] planned AUTO_REPAIR TIME_FOR_DEEP] scrubber: scrub_finish before flags: AUTO_REPAIR. repair state: no-repair. deep_scrub_on_error: 0
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:53:59.411+0000 7fb34135f640 10 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+clean+scrubbing+deep [ 1.0: AUTO_REPAIR ] ] scrubber: _scrub_finish info stats: valid m_is_repair: 1
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:53:59.411+0000 7fb34135f640 15 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish: 1 errors. 1 errors fixed
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:53:59.411+0000 7fb34135f640 20 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish All may be fixed
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T22:53:59.411+0000 7fb34135f640 19 osd.1 pg_epoch: 25 pg[1.0( v 20'1 (0'0,20'1] local-lis/les=24/25 n=1 ec=16/16 lis/c=24/24 les/c/f=25/25/0 sis=24) [1,0] r=0 lpr=24 crt=20'1 lcod 0'0 mlcod 0'0 active+scrubbing+deep+repair [ 1.0: AUTO_REPAIR ] mbc={255={(1+0)=1}}] scrubber: scrub_finish shard 1 num_omap_bytes = 0 num_omap_keys = 0
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:54:14.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:54:14.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:54:14.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:54:14.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:54:15.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:54:15.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:54:15.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:54:15.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:54:15.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:54:15.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:54:15.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:54:15.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:15.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:54:15.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:15.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:54:15.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:54:15.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:54:15.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T22:54:15.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:54:15.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:15.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:54:15.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:54:15.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:54:15.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:54:15.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T22:54:15.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T22:54:15.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T22:54:15.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T22:54:15.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T22:54:15.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T22:54:15.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:15.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T22:54:15.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T22:54:15.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T22:54:15.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T22:54:15.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T22:54:15.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T22:54:15.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T22:54:15.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:15.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:54:15.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T22:54:15.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T22:54:15.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T22:54:15.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair
2026-03-08T22:54:15.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T22:54:15.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:15.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:54:15.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024
2026-03-08T22:54:15.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T22:54:15.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T22:54:15.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T22:54:15.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT
2026-03-08T22:54:15.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_auto_repair_erasure_coded_appends td/osd-scrub-repair
2026-03-08T22:54:15.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:309: TEST_auto_repair_erasure_coded_appends: auto_repair_erasure_coded td/osd-scrub-repair false
2026-03-08T22:54:15.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:268: auto_repair_erasure_coded: local dir=td/osd-scrub-repair
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:269: auto_repair_erasure_coded: local allow_overwrites=false
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:270: auto_repair_erasure_coded: local poolname=ecpool
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:273: auto_repair_erasure_coded: run_mon td/osd-scrub-repair a
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a
2026-03-08T22:54:15.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair
2026-03-08T22:54:15.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T22:54:15.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:54:15.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:54:15.099 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:54:15.099 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:15.099 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:54:15.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:54:15.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:54:15.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:54:15.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:54:15.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:54:15.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:54:15.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:54:15.127 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:54:15.127 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:54:15.127 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:54:15.127 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:54:15.127 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:15.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:54:15.127 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:54:15.128 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T22:54:15.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:54:15.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T22:54:15.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:54:15.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:54:15.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:54:15.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:54:15.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:54:15.194 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:54:15.194 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:54:15.194 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:54:15.194 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:54:15.194 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:15.194 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:54:15.194 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T22:54:15.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:54:15.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T22:54:15.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:274: auto_repair_erasure_coded: run_mgr td/osd-scrub-repair x
2026-03-08T22:54:15.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T22:54:15.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:54:15.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:54:15.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:54:15.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T22:54:15.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:54:15.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:54:15.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:54:15.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:54:15.379 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:54:15.379 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:54:15.379 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:54:15.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:54:15.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:54:15.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:54:15.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:279: auto_repair_erasure_coded: local 'ceph_osd_args=--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:54:15.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:280: auto_repair_erasure_coded: seq 0 2
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:280: auto_repair_erasure_coded: for id in $(seq 0 2)
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:281: auto_repair_erasure_coded: run_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:15.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:54:15.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:15.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:15.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:15.407 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:15.407 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:15.407 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:54:15.407 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:15.408 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:54:15.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:54:15.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:15.409 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 2ba00b2e-2de2-4cfe-8bf6-83b31afaac65 2026-03-08T22:54:15.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2ba00b2e-2de2-4cfe-8bf6-83b31afaac65 2026-03-08T22:54:15.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 2ba00b2e-2de2-4cfe-8bf6-83b31afaac65' 2026-03-08T22:54:15.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:15.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAX/q1pLy0kGRAA42wi1VSZpxYsvmrBmDWNBg== 2026-03-08T22:54:15.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: 
run_osd: echo '{"cephx_secret": "AQAX/q1pLy0kGRAA42wi1VSZpxYsvmrBmDWNBg=="}' 2026-03-08T22:54:15.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2ba00b2e-2de2-4cfe-8bf6-83b31afaac65 -i td/osd-scrub-repair/0/new.json 2026-03-08T22:54:15.527 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:54:15.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T22:54:15.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQAX/q1pLy0kGRAA42wi1VSZpxYsvmrBmDWNBg== --osd-uuid 2ba00b2e-2de2-4cfe-8bf6-83b31afaac65 2026-03-08T22:54:15.565 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:15.563+0000 7fd5bc1da8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:15.576 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:15.575+0000 7fd5bc1da8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:15.578 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:15.575+0000 7fd5bc1da8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:15.578 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:15.575+0000 7fd5bc1da8c0 -1 bdev(0x55f52fdcec00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:15.578 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:15.575+0000 7fd5bc1da8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T22:54:17.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T22:54:17.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:17.850 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T22:54:17.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:54:17.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:17.963 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:54:17.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:54:17.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:17.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 
2026-03-08T22:54:17.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:17.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:54:18.019 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:18.011+0000 7fbf28cd58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:18.028 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:18.027+0000 7fbf28cd58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:18.039 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:18.035+0000 7fbf28cd58c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:18.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:18.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:18.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:18.992 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:18.991+0000 7fbf28cd58c0 -1 Falling back to public interface 2026-03-08T22:54:19.254 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:54:19.254 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:19.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:19.968 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:19.967+0000 7fbf28cd58c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:54:20.433 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:54:20.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:20.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:20.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:20.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:20.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:20.624 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:20.938 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:20.935+0000 7fbf2448e640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T22:54:21.625 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:54:21.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:21.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:21.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:21.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:21.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:54:21.799 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/4185355264,v1:127.0.0.1:6803/4185355264] [v2:127.0.0.1:6804/4185355264,v1:127.0.0.1:6805/4185355264] exists,up 2ba00b2e-2de2-4cfe-8bf6-83b31afaac65 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:21.800 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:280: auto_repair_erasure_coded: for id in $(seq 0 2) 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:281: auto_repair_erasure_coded: run_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' 
--osd-journal-size=100' 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:21.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:21.801 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:21.801 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:54:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:54:21.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:21.803 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 63dee2e8-ea0d-40a0-b127-00c3afd1316b 2026-03-08T22:54:21.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=63dee2e8-ea0d-40a0-b127-00c3afd1316b 2026-03-08T22:54:21.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 63dee2e8-ea0d-40a0-b127-00c3afd1316b' 2026-03-08T22:54:21.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:21.817 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAd/q1pwHS2MBAAhKVzEl4ktezyM+nyd/qvig== 2026-03-08T22:54:21.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAd/q1pwHS2MBAAhKVzEl4ktezyM+nyd/qvig=="}' 2026-03-08T22:54:21.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 63dee2e8-ea0d-40a0-b127-00c3afd1316b -i td/osd-scrub-repair/1/new.json 2026-03-08T22:54:22.048 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:54:22.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T22:54:22.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQAd/q1pwHS2MBAAhKVzEl4ktezyM+nyd/qvig== --osd-uuid 63dee2e8-ea0d-40a0-b127-00c3afd1316b 2026-03-08T22:54:22.082 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:22.079+0000 7f4a7398d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:22.084 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:22.083+0000 7f4a7398d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:22.085 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:22.083+0000 7f4a7398d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:22.085 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:22.083+0000 7f4a7398d8c0 -1 bdev(0x55f3a42e7c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:22.085 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:22.083+0000 7f4a7398d8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T22:54:24.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T22:54:24.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:24.341 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T22:54:24.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:54:24.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:24.555 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T22:54:24.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 
2026-03-08T22:54:24.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:54:24.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:24.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:24.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:24.573 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:24.571+0000 7f26c3c958c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:24.580 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:24.579+0000 7f26c3c958c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:24.581 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:24.579+0000 7f26c3c958c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:24.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:25.544 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:25.543+0000 7f26c3c958c0 -1 Falling back to public 
interface 2026-03-08T22:54:25.922 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:54:25.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:25.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:25.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:25.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:25.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:26.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:26.534 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:26.531+0000 7f26c3c958c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:54:27.101 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:54:27.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:27.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:27.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:27.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:27.103 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:27.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:28.310 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:54:28.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:28.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:28.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:28.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:28.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:28.490 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 9 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1214911622,v1:127.0.0.1:6811/1214911622] [v2:127.0.0.1:6812/1214911622,v1:127.0.0.1:6813/1214911622] exists,up 63dee2e8-ea0d-40a0-b127-00c3afd1316b 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:28.491 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:280: auto_repair_erasure_coded: for id in $(seq 0 2) 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:281: auto_repair_erasure_coded: run_osd td/osd-scrub-repair 2 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' 
--osd-journal-size=100' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:28.491 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:54:28.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:28.492 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:54:28.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:54:28.493 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 b0ff8a6b-06e7-4f86-bd39-0e8cae7f7619 2026-03-08T22:54:28.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b0ff8a6b-06e7-4f86-bd39-0e8cae7f7619 2026-03-08T22:54:28.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 b0ff8a6b-06e7-4f86-bd39-0e8cae7f7619' 2026-03-08T22:54:28.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:54:28.506 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAk/q1pTtYrHhAA1W26wWPeULufRBE0mJh8Ew== 2026-03-08T22:54:28.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAk/q1pTtYrHhAA1W26wWPeULufRBE0mJh8Ew=="}' 2026-03-08T22:54:28.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b0ff8a6b-06e7-4f86-bd39-0e8cae7f7619 -i td/osd-scrub-repair/2/new.json 2026-03-08T22:54:28.690 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:54:28.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T22:54:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQAk/q1pTtYrHhAA1W26wWPeULufRBE0mJh8Ew== --osd-uuid b0ff8a6b-06e7-4f86-bd39-0e8cae7f7619 2026-03-08T22:54:28.726 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:28.724+0000 7f61972b88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:28.729 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:28.728+0000 7f61972b88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:28.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:28.728+0000 7f61972b88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:28.731 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:28.732+0000 7f61972b88c0 -1 bdev(0x56487b9a1c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:54:28.731 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:28.732+0000 7f61972b88c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T22:54:31.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T22:54:31.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:54:31.059 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T22:54:31.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:54:31.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:54:31.269 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T22:54:31.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 
2026-03-08T22:54:31.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:54:31.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:54:31.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:54:31.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:54:31.286 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:31.284+0000 7f0c160aa8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:31.291 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:31.292+0000 7f0c160aa8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:31.293 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:31.292+0000 7f0c160aa8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:31.463 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:54:31.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:54:31.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:31.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:54:31.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:31.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:31.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:31.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:31.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:31.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:31.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:32.636 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:54:32.636 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:32.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:32.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:32.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:32.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:32.736 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:32.736+0000 7f0c160aa8c0 -1 Falling back to public interface 2026-03-08T22:54:32.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:33.709 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:33.708+0000 7f0c160aa8c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:54:33.819 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:54:33.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:33.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:33.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:33.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:33.820 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:34.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:35.134 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:54:35.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:35.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:35.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:35.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:35.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:35.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:36.385 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T22:54:36.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:36.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:36.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:54:36.386 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:36.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:54:36.551 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1985857799,v1:127.0.0.1:6819/1985857799] [v2:127.0.0.1:6820/1985857799,v1:127.0.0.1:6821/1985857799] exists,up b0ff8a6b-06e7-4f86-bd39-0e8cae7f7619 2026-03-08T22:54:36.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:36.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:36.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:36.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:283: auto_repair_erasure_coded: create_rbd_pool 2026-03-08T22:54:36.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T22:54:36.720 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T22:54:36.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T22:54:36.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:54:36.940 
INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T22:54:36.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:54:37.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:284: auto_repair_erasure_coded: wait_for_clean 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:54:38.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:54:38.308 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:54:38.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:54:38.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:54:38.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:54:38.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:54:38.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:54:38.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:54:38.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:54:38.534 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:54:38.534 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:54:38.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:54:38.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:38.534 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:54:38.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T22:54:38.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T22:54:38.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T22:54:38.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:38.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:54:38.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705668 2026-03-08T22:54:38.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705668 2026-03-08T22:54:38.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-38654705668' 2026-03-08T22:54:38.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:38.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:54:38.780 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542146 2026-03-08T22:54:38.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542146 2026-03-08T22:54:38.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-38654705668 2-60129542146' 2026-03-08T22:54:38.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:38.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T22:54:38.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:38.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:54:38.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T22:54:38.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:38.784 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T22:54:38.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T22:54:38.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 
2026-03-08T22:54:38.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:54:38.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T22:54:38.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:54:39.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:54:39.961 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:54:40.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T22:54:40.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:54:41.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:54:41.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:54:41.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485 2026-03-08T22:54:41.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:41.328 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705668 2026-03-08T22:54:41.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:41.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:54:41.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705668 2026-03-08T22:54:41.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:41.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705668 2026-03-08T22:54:41.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705668' 2026-03-08T22:54:41.331 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 38654705668 2026-03-08T22:54:41.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:54:41.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705668 -lt 38654705668 2026-03-08T22:54:41.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:41.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542146 
2026-03-08T22:54:41.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:41.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:54:41.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542146 2026-03-08T22:54:41.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:41.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542146 2026-03-08T22:54:41.506 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542146 2026-03-08T22:54:41.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542146' 2026-03-08T22:54:41.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:54:41.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542146 -lt 60129542146 2026-03-08T22:54:41.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:54:41.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:41.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq 
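The `flush_pg_stats` cycle traced above works in two phases: `ceph tell osd.N flush_pg_stats` returns a sequence number, and the helper then polls `ceph osd last-stat-seq N` once per second (with a 300-iteration budget) until the monitor reports a seq at least that fresh. A hypothetical Python rendering of that loop, with the command runner and sleep injected so the logic is testable — the real helper is bash, and `run`/`sleep` are illustration-only parameters:

```python
import time

def flush_pg_stats(run, osd_ids, timeout=300, sleep=time.sleep):
    """Ask each OSD to flush PG stats, then wait until the monitor's
    last-stat-seq for that OSD catches up to the returned sequence.
    `run` executes a CLI command string and returns its stdout."""
    seqs = {osd: int(run(f"ceph tell osd.{osd} flush_pg_stats"))
            for osd in osd_ids}
    for osd, seq in seqs.items():
        budget = timeout
        # e.g. osd.0 above: last-stat-seq 21474836484 < 21474836485,
        # so the helper sleeps 1s and retries until the seq catches up.
        while int(run(f"ceph osd last-stat-seq {osd}")) < seq:
            sleep(1)
            budget -= 1
            if budget == 0:
                raise TimeoutError(f"osd.{osd} never reached seq {seq}")
```

In the logged run, osd.0 needed two 1s sleeps before `last-stat-seq` caught up, while osd.1 and osd.2 were already current on the first poll.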
.pgmap.num_pgs 2026-03-08T22:54:41.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T22:54:41.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:54:41.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:54:41.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:54:41.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:54:41.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:54:41.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:54:41.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:54:42.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T22:54:42.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:54:42.092 
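`get_num_active_clean` pipes `ceph --format json pg dump pgs` through the jq filter shown in the trace, counting PG states that contain both "active" and "clean" but not "stale"; `wait_for_clean` then compares that count against `.pgmap.num_pgs`. The same filter, sketched in Python over the parsed `pg_stats` array:

```python
def num_active_clean(pg_stats):
    # jq: .pg_stats | [.[] | .state
    #       | select(contains("active") and contains("clean"))
    #       | select(contains("stale") | not)] | length
    return sum(1 for pg in pg_stats
               if "active" in pg["state"]
               and "clean" in pg["state"]
               and "stale" not in pg["state"])
```

Note that `contains` is substring matching, so compound states such as `active+clean+scrubbing` still count, while `stale+active+clean` is excluded.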
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:42.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:54:42.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T22:54:42.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:54:42.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:54:42.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:287: auto_repair_erasure_coded: create_ec_pool ecpool false k=2 m=1 2026-03-08T22:54:42.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T22:54:42.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T22:54:42.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=false 2026-03-08T22:54:42.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T22:54:42.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=1 2026-03-08T22:54:42.580 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T22:54:42.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T22:54:42.907 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T22:54:42.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' false = true ']' 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: echo true 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:54:43.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:54:43.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:54:43.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:54:43.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:54:43.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:54:43.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:54:43.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:54:44.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:54:44.174 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:54:44.174 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:54:44.174 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:54:44.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:44.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:54:44.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836487 2026-03-08T22:54:44.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836487 2026-03-08T22:54:44.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487' 2026-03-08T22:54:44.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:44.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:54:44.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705670 2026-03-08T22:54:44.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705670 2026-03-08T22:54:44.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-38654705670' 2026-03-08T22:54:44.349 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:44.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:54:44.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542148 2026-03-08T22:54:44.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542148 2026-03-08T22:54:44.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-38654705670 2-60129542148' 2026-03-08T22:54:44.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:44.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836487 2026-03-08T22:54:44.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:44.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:54:44.439 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836487 2026-03-08T22:54:44.439 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:44.440 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836487 
2026-03-08T22:54:44.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836487 2026-03-08T22:54:44.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836487' 2026-03-08T22:54:44.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:54:44.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T22:54:44.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:54:45.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:54:45.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:54:45.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T22:54:45.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:54:46.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:54:46.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:54:46.968 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836488 -lt 21474836487 2026-03-08T22:54:46.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:46.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705670 2026-03-08T22:54:46.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:46.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:54:46.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705670 2026-03-08T22:54:46.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:46.971 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 38654705670 2026-03-08T22:54:46.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705670 2026-03-08T22:54:46.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705670' 2026-03-08T22:54:46.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:54:47.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705670 -lt 
38654705670 2026-03-08T22:54:47.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:47.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542148 2026-03-08T22:54:47.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:47.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:54:47.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542148 2026-03-08T22:54:47.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:47.147 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542148 2026-03-08T22:54:47.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542148 2026-03-08T22:54:47.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542148' 2026-03-08T22:54:47.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:54:47.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542149 -lt 60129542148 2026-03-08T22:54:47.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
wait_for_clean: get_num_pgs 2026-03-08T22:54:47.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:47.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:54:47.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:54:47.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:54:47.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:54:47.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:54:47.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:54:47.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:54:47.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:54:47.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:54:47.712 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:54:47.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:54:47.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:47.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:54:47.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:54:47.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:54:47.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:54:47.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T22:54:47.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:290: auto_repair_erasure_coded: local payload=ABCDEF 2026-03-08T22:54:47.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:291: auto_repair_erasure_coded: echo ABCDEF 2026-03-08T22:54:47.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:292: auto_repair_erasure_coded: rados --pool ecpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T22:54:48.003 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:296: auto_repair_erasure_coded: get_not_primary ecpool SOMETHING 2026-03-08T22:54:48.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=ecpool 2026-03-08T22:54:48.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:54:48.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary ecpool SOMETHING 2026-03-08T22:54:48.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool 2026-03-08T22:54:48.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:54:48.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:54:48.003 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:54:48.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=2 2026-03-08T22:54:48.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:54:48.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | 
map(select (. != 2)) | .[0]' 2026-03-08T22:54:48.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:296: auto_repair_erasure_coded: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:54:48.350 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:54:48.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:54:48.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:54:48.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:54:48.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:54:48.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:54:48.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:54:48.455 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:54:48.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:54:48.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove 2026-03-08T22:54:49.119 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T22:54:49.120 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.3_head,#-3:c0000000:::scrub_1.3:head#, (61) No data available 2026-03-08T22:54:49.120 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#-3:00000000:::scrub_1.0:head#, (61) No data available 2026-03-08T22:54:49.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:54:49.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 
2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 
2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:54:49.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T22:54:49.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 
--auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:54:49.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:54:49.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:54:49.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:54:49.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:54:49.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:54:49.672 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:49.668+0000 7fe9c40578c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:49.674 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:49.672+0000 7fe9c40578c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:54:49.676 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:49.676+0000 7fe9c40578c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:54:49.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:49.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:50.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:50.396 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:50.396+0000 7fe9c40578c0 -1 Falling back to 
public interface 2026-03-08T22:54:51.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:51.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:51.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:54:51.034 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:54:51.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:51.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:51.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:51.619 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:51.616+0000 7fe9c40578c0 -1 osd.1 28 log_to_monitors true 2026-03-08T22:54:52.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:52.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:52.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:54:52.202 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:54:52.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:52.202 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:52.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:54:52.974 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:54:52.972+0000 7fe9bb007640 -1 osd.1 28 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:54:53.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:54:53.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:54:53.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:54:53.565 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:54:53.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:54:53.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:54:53.756 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 32 up_thru 32 down_at 29 last_clean_interval [9,28) [v2:127.0.0.1:6810/3681187177,v1:127.0.0.1:6811/3681187177] [v2:127.0.0.1:6812/3681187177,v1:127.0.0.1:6813/3681187177] exists,up 63dee2e8-ea0d-40a0-b127-00c3afd1316b 2026-03-08T22:54:53.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:54:53.756 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:54:53.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:54:53.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:54:53.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:54:53.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:54:53.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:54:53.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:54:53.757 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:54:53.757 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:54:53.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:54:53.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:54:53.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:54:53.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:54:53.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:54:53.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:54:53.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:54:53.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:54:53.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:54:54.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:54:54.007 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:54:54.007 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:54:54.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:54:54.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:54.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:54:54.092 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836490 2026-03-08T22:54:54.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836490 2026-03-08T22:54:54.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490' 2026-03-08T22:54:54.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:54.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:54:54.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=137438953474 2026-03-08T22:54:54.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 137438953474 2026-03-08T22:54:54.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-137438953474' 2026-03-08T22:54:54.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:54:54.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:54:54.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542151 2026-03-08T22:54:54.262 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542151 2026-03-08T22:54:54.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-137438953474 2-60129542151' 2026-03-08T22:54:54.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:54.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836490 2026-03-08T22:54:54.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:54.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:54:54.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836490 2026-03-08T22:54:54.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:54.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836490 2026-03-08T22:54:54.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836490' 2026-03-08T22:54:54.265 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836490 2026-03-08T22:54:54.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T22:54:54.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836490 2026-03-08T22:54:54.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:54.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-137438953474 2026-03-08T22:54:54.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:54.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:54:54.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-137438953474 2026-03-08T22:54:54.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:54.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=137438953474 2026-03-08T22:54:54.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 137438953474' 2026-03-08T22:54:54.456 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 137438953474 2026-03-08T22:54:54.456 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:54:54.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 1 -lt 137438953474 2026-03-08T22:54:54.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:54:55.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:54:55.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:54:55.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 137438953474 2026-03-08T22:54:55.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:54:56.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:54:56.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:54:57.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 137438953474 -lt 137438953474 2026-03-08T22:54:57.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:54:57.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542151 2026-03-08T22:54:57.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:54:57.066 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:54:57.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542151 2026-03-08T22:54:57.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:54:57.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542151 2026-03-08T22:54:57.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542151' 2026-03-08T22:54:57.067 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542151 2026-03-08T22:54:57.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:54:57.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542152 -lt 60129542151 2026-03-08T22:54:57.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:54:57.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:54:57.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:54:57.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 
2026-03-08T22:54:57.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:54:57.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:54:57.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:54:57.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:54:57.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:54:57.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:54:57.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:54:57.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:54:57.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:54:57.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:54:57.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:54:57.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:54:57.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:54:57.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:54:57.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:298: auto_repair_erasure_coded: get_pg ecpool SOMETHING
2026-03-08T22:54:57.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool
2026-03-08T22:54:57.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING
2026-03-08T22:54:57.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING
2026-03-08T22:54:57.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T22:54:58.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:298: auto_repair_erasure_coded: local pgid=2.0
2026-03-08T22:54:58.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:299: auto_repair_erasure_coded: get_last_scrub_stamp 2.0
2026-03-08T22:54:58.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:54:58.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:54:58.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:54:58.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:299: auto_repair_erasure_coded: wait_for_scrub 2.0 2026-03-08T22:54:42.898226+0000
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:54:42.898226+0000
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:54:58.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:54:58.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:54:42.898226+0000 '>' 2026-03-08T22:54:42.898226+0000
2026-03-08T22:54:58.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:54:59.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:54:59.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:54:59.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:54:59.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:54:59.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:54:59.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:54:59.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:54:59.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:54:42.898226+0000 '>' 2026-03-08T22:54:42.898226+0000
2026-03-08T22:54:59.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:55:00.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:55:00.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:55:00.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:55:00.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:55:00.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:55:00.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:55:00.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:55:00.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:54:42.898226+0000 '>' 2026-03-08T22:54:42.898226+0000
2026-03-08T22:55:00.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:55:01.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:55:01.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:55:01.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:55:01.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:55:01.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:55:01.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:55:01.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:55:02.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:54:42.898226+0000 '>' 2026-03-08T22:54:42.898226+0000
2026-03-08T22:55:02.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:55:03.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:55:03.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:55:03.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:55:03.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:55:03.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:55:03.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:55:03.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:55:03.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:54:57.454454+0000 '>' 2026-03-08T22:54:42.898226+0000
2026-03-08T22:55:03.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T22:55:03.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:300: auto_repair_erasure_coded: wait_for_clean
2026-03-08T22:55:03.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:55:03.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:55:03.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:55:03.190 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:55:03.190 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:55:03.190 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:55:03.190 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:55:03.190 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:55:03.190 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:55:03.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:55:03.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:55:03.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:55:03.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:55:03.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:55:03.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:55:03.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:55:03.443 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:55:03.443 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:55:03.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:55:03.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:03.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:55:03.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836493
2026-03-08T22:55:03.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836493
2026-03-08T22:55:03.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493'
2026-03-08T22:55:03.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:03.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:55:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=137438953477
2026-03-08T22:55:03.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 137438953477
2026-03-08T22:55:03.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493 1-137438953477'
2026-03-08T22:55:03.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:03.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:55:03.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542154
2026-03-08T22:55:03.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542154
2026-03-08T22:55:03.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493 1-137438953477 2-60129542154'
2026-03-08T22:55:03.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:03.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836493
2026-03-08T22:55:03.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:03.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:55:03.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836493
2026-03-08T22:55:03.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:03.710 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836493
2026-03-08T22:55:03.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836493
2026-03-08T22:55:03.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836493'
2026-03-08T22:55:03.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:03.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836493
2026-03-08T22:55:03.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:55:04.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:55:04.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:05.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836493 -lt 21474836493
2026-03-08T22:55:05.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:05.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-137438953477
2026-03-08T22:55:05.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:05.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:55:05.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-137438953477
2026-03-08T22:55:05.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:05.080 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 137438953477
2026-03-08T22:55:05.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=137438953477
2026-03-08T22:55:05.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 137438953477'
2026-03-08T22:55:05.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:55:05.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 137438953477 -lt 137438953477
2026-03-08T22:55:05.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:05.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542154
2026-03-08T22:55:05.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:05.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:55:05.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542154
2026-03-08T22:55:05.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:05.265 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542154
2026-03-08T22:55:05.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542154
2026-03-08T22:55:05.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542154'
2026-03-08T22:55:05.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:55:05.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542154 -lt 60129542154
2026-03-08T22:55:05.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:55:05.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:05.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:55:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:55:05.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:55:05.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:55:05.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:55:05.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:55:05.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:55:05.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:55:05.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:55:05.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:55:05.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:05.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:55:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:55:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:55:06.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:303: auto_repair_erasure_coded: get_not_primary ecpool SOMETHING
2026-03-08T22:55:06.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=ecpool
2026-03-08T22:55:06.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING
2026-03-08T22:55:06.068 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary ecpool SOMETHING
2026-03-08T22:55:06.068 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool
2026-03-08T22:55:06.068 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING
2026-03-08T22:55:06.068 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING
2026-03-08T22:55:06.068 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:55:06.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=2
2026-03-08T22:55:06.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map ecpool SOMETHING
2026-03-08T22:55:06.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 2)) | .[0]'
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:303: auto_repair_erasure_coded: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:55:06.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:55:06.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:55:06.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:55:06.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T22:55:06.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:55:06.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T22:55:06.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:55:06.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:55:06.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING list-attrs
2026-03-08T22:55:07.099 INFO:tasks.workunit.client.0.vm03.stdout:_
2026-03-08T22:55:07.099 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key
2026-03-08T22:55:07.099 INFO:tasks.workunit.client.0.vm03.stdout:snapset
2026-03-08T22:55:07.100 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.3_head,#-3:c0000000:::scrub_1.3:head#, (61) No data available
2026-03-08T22:55:07.100 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#-3:00000000:::scrub_1.0:head#, (61) No data available
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:55:07.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T22:55:07.380 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T22:55:07.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:55:07.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T22:55:07.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T22:55:07.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:55:07.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:55:07.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:55:07.398 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:07.392+0000 7fe0873788c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:07.399 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:07.396+0000 7fe0873788c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:07.400 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:07.396+0000 7fe0873788c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:07.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T22:55:07.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:55:07.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:55:07.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:55:07.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:55:07.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:07.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:55:07.587 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T22:55:07.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:07.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:08.364 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:08.360+0000 7fe0873788c0 -1 Falling back to public interface
2026-03-08T22:55:08.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:08.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:08.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:55:08.785 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:55:08.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:08.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:08.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:09.624 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:09.620+0000 7fe0873788c0 -1 osd.1 34 log_to_monitors true
2026-03-08T22:55:09.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:09.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:09.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:55:09.967 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:55:09.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:09.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:10.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:10.562 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:10.556+0000 7fe07e328640 -1 osd.1 34 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T22:55:11.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:11.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:11.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:55:11.174 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T22:55:11.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:11.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 38 up_thru 38 down_at 35 last_clean_interval [32,34) [v2:127.0.0.1:6810/1519218756,v1:127.0.0.1:6811/1519218756] [v2:127.0.0.1:6812/1519218756,v1:127.0.0.1:6813/1519218756] exists,up 63dee2e8-ea0d-40a0-b127-00c3afd1316b
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:55:11.350 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:55:11.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:55:11.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:55:11.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:55:11.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:55:11.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:55:11.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:55:11.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:55:11.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:55:11.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:55:11.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:55:11.606 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:55:11.606 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:55:11.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:55:11.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:11.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:55:11.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836496
2026-03-08T22:55:11.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836496
2026-03-08T22:55:11.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496'
2026-03-08T22:55:11.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:11.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:55:11.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=163208757250
2026-03-08T22:55:11.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 163208757250
2026-03-08T22:55:11.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-163208757250'
2026-03-08T22:55:11.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:11.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:55:11.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542157
2026-03-08T22:55:11.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542157
2026-03-08T22:55:11.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-163208757250 2-60129542157'
2026-03-08T22:55:11.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:11.869 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836496
2026-03-08T22:55:11.869 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:11.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:55:11.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836496
2026-03-08T22:55:11.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:11.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836496
2026-03-08T22:55:11.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836496'
2026-03-08T22:55:11.871 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836496
2026-03-08T22:55:11.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:12.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836494 -lt 21474836496
2026-03-08T22:55:12.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:55:13.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:55:13.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:13.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836496
2026-03-08T22:55:13.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:13.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-163208757250
2026-03-08T22:55:13.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:13.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:55:13.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-163208757250
2026-03-08T22:55:13.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:13.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=163208757250
2026-03-08T22:55:13.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 163208757250'
2026-03-08T22:55:13.233 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 163208757250
2026-03-08T22:55:13.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:55:13.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 163208757250 -lt 163208757250
2026-03-08T22:55:13.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:13.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542157
2026-03-08T22:55:13.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:13.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:55:13.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542157
2026-03-08T22:55:13.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:13.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542157
2026-03-08T22:55:13.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542157'
2026-03-08T22:55:13.417 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542157
2026-03-08T22:55:13.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:55:13.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542157 -lt 60129542157
2026-03-08T22:55:13.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:55:13.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:13.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:13.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:55:13.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:55:13.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:55:13.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:55:13.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:55:13.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:55:13.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:55:13.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:55:14.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:55:14.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:55:14.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:14.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:14.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:55:14.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:55:14.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:55:14.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:304: auto_repair_erasure_coded: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY
2026-03-08T22:55:14.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:305: auto_repair_erasure_coded: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY
2026-03-08T22:55:14.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair
2026-03-08T22:55:14.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T22:55:14.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T22:55:14.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T22:55:14.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:55:14.272
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:14.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:14.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:14.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:14.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:14.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:55:14.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:55:14.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:55:14.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:55:14.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:55:14.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:55:14.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:14.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:55:14.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:55:14.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:14.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:55:14.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:55:14.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:55:14.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:55:14.423 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:14.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:14.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:14.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:14.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:14.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:14.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:14.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:14.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:55:14.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:55:14.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:55:14.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:55:14.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:55:14.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:55:14.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:14.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:55:14.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:55:14.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:55:14.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:55:14.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:55:14.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:55:14.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:55:14.434 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:14.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:14.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:55:14.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:55:14.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:55:14.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T22:55:14.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:55:14.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:14.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:14.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T22:55:14.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_auto_repair_erasure_coded_overwrites td/osd-scrub-repair 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:313: TEST_auto_repair_erasure_coded_overwrites: '[' true = true ']' 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:314: TEST_auto_repair_erasure_coded_overwrites: auto_repair_erasure_coded td/osd-scrub-repair true 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:268: auto_repair_erasure_coded: local dir=td/osd-scrub-repair 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:269: auto_repair_erasure_coded: local allow_overwrites=true 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:270: auto_repair_erasure_coded: local poolname=ecpool 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:273: auto_repair_erasure_coded: run_mon td/osd-scrub-repair a 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 
2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T22:55:14.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T22:55:14.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:55:14.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:14.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:14.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:14.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:14.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:14.466 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:55:14.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:55:14.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:55:14.495 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:55:14.495 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:55:14.495 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:55:14.495 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:55:14.497 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 
2026-03-08T22:55:14.497 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:55:14.497 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:55:14.497 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:55:14.497 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:55:14.497 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:14.497 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:14.497 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:55:14.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:55:14.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T22:55:14.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:55:14.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:55:14.577 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:55:14.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:55:14.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:55:14.577 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:55:14.578 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:55:14.578 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:55:14.578 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:55:14.578 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:14.578 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:14.578 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:55:14.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:55:14.578 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T22:55:14.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:274: auto_repair_erasure_coded: run_mgr td/osd-scrub-repair x 2026-03-08T22:55:14.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T22:55:14.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:55:14.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:55:14.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:55:14.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T22:55:14.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:55:14.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:55:14.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:14.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:14.772 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:14.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:14.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:14.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:55:14.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:55:14.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:55:14.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:279: auto_repair_erasure_coded: local 'ceph_osd_args=--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:55:14.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:280: auto_repair_erasure_coded: seq 0 2 2026-03-08T22:55:14.803 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:280: auto_repair_erasure_coded: for id in $(seq 0 2) 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:281: auto_repair_erasure_coded: run_osd td/osd-scrub-repair 0 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' 
--osd-journal-size=100' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:14.803 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:14.804 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:14.804 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:55:14.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:55:14.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:55:14.806 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 b2693d37-3421-42de-b8ee-5b150f7af365 2026-03-08T22:55:14.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b2693d37-3421-42de-b8ee-5b150f7af365 2026-03-08T22:55:14.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 b2693d37-3421-42de-b8ee-5b150f7af365' 2026-03-08T22:55:14.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:55:14.818 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBS/q1pEI7LMBAAv+pfij/W8bcXrDSyL2yf+w== 2026-03-08T22:55:14.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBS/q1pEI7LMBAAv+pfij/W8bcXrDSyL2yf+w=="}' 2026-03-08T22:55:14.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b2693d37-3421-42de-b8ee-5b150f7af365 -i td/osd-scrub-repair/0/new.json 2026-03-08T22:55:14.920 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:55:14.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T22:55:14.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQBS/q1pEI7LMBAAv+pfij/W8bcXrDSyL2yf+w== --osd-uuid b2693d37-3421-42de-b8ee-5b150f7af365 2026-03-08T22:55:14.953 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:14.948+0000 7f9f1141a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:14.955 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:14.952+0000 7f9f1141a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:14.956 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:14.952+0000 7f9f1141a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:14.956 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:14.952+0000 7f9f1141a8c0 -1 bdev(0x55ef67ed8c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:55:14.956 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:14.952+0000 7f9f1141a8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T22:55:17.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T22:55:17.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:55:17.493 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T22:55:17.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:55:17.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:55:17.624 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:55:17.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 
2026-03-08T22:55:17.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:55:17.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:55:17.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:55:17.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:55:17.642 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:17.636+0000 7f98062588c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:17.645 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:17.640+0000 7f98062588c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:55:17.653 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:17.644+0000 7f98062588c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:17.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:55:17.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:17.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:55:17.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:17.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:17.786 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:55:17.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:17.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:17.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:17.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:55:17.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:18.111 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:18.108+0000 7f98062588c0 -1 Falling back to public 
interface 2026-03-08T22:55:18.964 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:55:18.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:18.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:18.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:18.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:18.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:55:19.103 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:19.100+0000 7f98062588c0 -1 osd.0 0 log_to_monitors true 2026-03-08T22:55:19.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:20.152 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:55:20.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:20.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:20.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:20.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:20.153 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:55:20.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:21.381 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:55:21.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:21.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:21.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:55:21.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:21.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:55:21.555 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3631974448,v1:127.0.0.1:6803/3631974448] [v2:127.0.0.1:6804/3631974448,v1:127.0.0.1:6805/3631974448] exists,up b2693d37-3421-42de-b8ee-5b150f7af365 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:55:21.556 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:280: auto_repair_erasure_coded: for id in $(seq 0 2) 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:281: auto_repair_erasure_coded: run_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' 
--osd-journal-size=100' 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:21.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:21.557 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:21.557 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:21.557 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:21.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:21.558 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:55:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:55:21.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:55:21.560 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 4f288625-5fe0-4545-a3f6-17a5bc564768 2026-03-08T22:55:21.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4f288625-5fe0-4545-a3f6-17a5bc564768 2026-03-08T22:55:21.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 4f288625-5fe0-4545-a3f6-17a5bc564768' 2026-03-08T22:55:21.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:55:21.573 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBZ/q1pfM4vIhAAnOdEHI5O1RPCzH3MOOL0Bg== 2026-03-08T22:55:21.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBZ/q1pfM4vIhAAnOdEHI5O1RPCzH3MOOL0Bg=="}' 2026-03-08T22:55:21.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4f288625-5fe0-4545-a3f6-17a5bc564768 -i td/osd-scrub-repair/1/new.json 2026-03-08T22:55:21.847 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:55:21.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T22:55:21.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQBZ/q1pfM4vIhAAnOdEHI5O1RPCzH3MOOL0Bg== --osd-uuid 4f288625-5fe0-4545-a3f6-17a5bc564768 2026-03-08T22:55:21.881 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:21.876+0000 7f22dc58f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:21.883 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:21.880+0000 7f22dc58f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:21.884 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:21.880+0000 7f22dc58f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:21.884 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:21.880+0000 7f22dc58f8c0 -1 bdev(0x555eb576dc00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T22:55:21.884 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:21.880+0000 7f22dc58f8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T22:55:24.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T22:55:24.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:55:24.141 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T22:55:24.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T22:55:24.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:55:24.363 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T22:55:24.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 
2026-03-08T22:55:24.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:55:24.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:55:24.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:55:24.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:55:24.386 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:24.381+0000 7f8462dd28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:24.387 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:24.385+0000 7f8462dd28c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:55:24.389 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:24.385+0000 7f8462dd28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:24.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:25.751 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:55:25.752 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:25.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:25.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:25.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:25.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:25.843 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:25.841+0000 7f8462dd28c0 -1 Falling back to public interface 2026-03-08T22:55:25.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:26.932 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:55:26.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:26.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:26.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:26.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:26.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 
2026-03-08T22:55:27.073 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:27.069+0000 7f8462dd28c0 -1 osd.1 0 log_to_monitors true
2026-03-08T22:55:27.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:28.119 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:55:28.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:28.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:28.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:55:28.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:28.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1849217127,v1:127.0.0.1:6811/1849217127] [v2:127.0.0.1:6812/1849217127,v1:127.0.0.1:6813/1849217127] exists,up 4f288625-5fe0-4545-a3f6-17a5bc564768
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:280: auto_repair_erasure_coded: for id in $(seq 0 2)
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:281: auto_repair_erasure_coded: run_osd td/osd-scrub-repair 2 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2'
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal'
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:55:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:55:28.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2
2026-03-08T22:55:28.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:55:28.325 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 3c7874c0-9743-48c2-b1c6-cd3841cfe800
2026-03-08T22:55:28.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3c7874c0-9743-48c2-b1c6-cd3841cfe800
2026-03-08T22:55:28.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 3c7874c0-9743-48c2-b1c6-cd3841cfe800'
2026-03-08T22:55:28.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:55:28.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBg/q1pNlgpFBAAqZ2J0VITjN65u5EzmJ4Apw==
2026-03-08T22:55:28.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBg/q1pNlgpFBAAqZ2J0VITjN65u5EzmJ4Apw=="}'
2026-03-08T22:55:28.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3c7874c0-9743-48c2-b1c6-cd3841cfe800 -i td/osd-scrub-repair/2/new.json
2026-03-08T22:55:28.507 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:55:28.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json
2026-03-08T22:55:28.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQBg/q1pNlgpFBAAqZ2J0VITjN65u5EzmJ4Apw== --osd-uuid 3c7874c0-9743-48c2-b1c6-cd3841cfe800
2026-03-08T22:55:28.548 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:28.545+0000 7f468d20a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:28.550 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:28.549+0000 7f468d20a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:28.551 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:28.549+0000 7f468d20a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:28.552 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:28.549+0000 7f468d20a8c0 -1 bdev(0x5624246bfc00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:55:28.552 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:28.549+0000 7f468d20a8c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid
2026-03-08T22:55:30.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring
2026-03-08T22:55:30.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:55:30.808 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository
2026-03-08T22:55:30.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T22:55:30.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:55:31.031 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2
2026-03-08T22:55:31.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T22:55:31.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:55:31.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:55:31.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:55:31.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:55:31.051 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:31.045+0000 7ff57793a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:31.051 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:31.049+0000 7ff57793a8c0 -1 WARNING: all dangerous and experimental features are enabled.
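[editor's note] At this point `run_osd` has completed its full bring-up for osd.2: allocate the id in the osdmap (`ceph osd new`), format the store (`ceph-osd ... --mkfs`), register the daemon's key (`ceph auth add`), then launch the daemon with the same argument list minus `--mkfs`. A condensed sketch of that sequence, reconstructed from the trace; `run_osd_sketch` is a hypothetical name, and the real helper (`run_osd` in `ceph-helpers.sh`) does more, e.g. the `noup`-flag check seen above:

```shell
# Two-phase OSD bring-up, as traced: mkfs first, then start the daemon.
# Reconstruction for illustration only -- not the verbatim helper.
run_osd_sketch() {
    local dir=$1 id=$2
    shift 2                                   # remaining args are per-OSD ceph-osd options
    local osd_data=$dir/$id
    mkdir -p "$osd_data"
    local uuid secret
    uuid=$(uuidgen)
    secret=$(ceph-authtool --gen-print-key)
    echo "{\"cephx_secret\": \"$secret\"}" > "$osd_data/new.json"
    ceph osd new "$uuid" -i "$osd_data/new.json"       # allocate the id in the osdmap
    rm "$osd_data/new.json"
    ceph-osd -i "$id" "$@" --mkfs --key "$secret" --osd-uuid "$uuid"   # phase 1: format the store
    ceph -i "$osd_data/keyring" auth add "osd.$id" \
        osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'  # register the key
    ceph-osd -i "$id" "$@"                             # phase 2: start the daemon
    wait_for_osd up "$id"                              # helper from ceph-helpers.sh
}
```

The harmless-looking `bdev ... open stat got: (1) Operation not permitted` and `_read_fsid unparsable uuid` lines earlier are expected on a fresh `--mkfs` run: the block device file does not exist yet when BlueStore first probes it.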
2026-03-08T22:55:31.053 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:31.049+0000 7ff57793a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:55:31.228 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:31.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:55:31.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:31.503 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:31.501+0000 7ff57793a8c0 -1 Falling back to public interface
2026-03-08T22:55:32.434 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:55:32.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:32.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:32.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:55:32.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:32.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:55:32.508 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:32.505+0000 7ff57793a8c0 -1 osd.2 0 log_to_monitors true
2026-03-08T22:55:32.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:33.610 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:55:33.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:33.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:33.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:55:33.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:33.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:55:33.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:55:34.813 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:55:34.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:55:34.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:55:34.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:55:34.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:55:34.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:55:34.991 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3083262254,v1:127.0.0.1:6819/3083262254] [v2:127.0.0.1:6820/3083262254,v1:127.0.0.1:6821/3083262254] exists,up 3c7874c0-9743-48c2-b1c6-cd3841cfe800
2026-03-08T22:55:34.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:55:34.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:55:34.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:55:34.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:283: auto_repair_erasure_coded: create_rbd_pool
2026-03-08T22:55:34.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it
2026-03-08T22:55:35.163 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist
2026-03-08T22:55:35.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4
2026-03-08T22:55:35.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4
2026-03-08T22:55:35.410 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created
2026-03-08T22:55:35.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T22:55:36.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd
2026-03-08T22:55:36.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:284: auto_repair_erasure_coded: wait_for_clean
2026-03-08T22:55:36.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:55:36.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:55:36.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:55:36.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:55:36.731 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:55:36.731 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:55:36.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:55:36.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:55:36.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:55:36.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:55:36.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:55:36.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:55:36.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:55:36.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:55:36.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:55:36.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:55:36.970 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:55:36.970 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:55:36.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:55:36.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:36.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:55:37.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485
2026-03-08T22:55:37.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485
2026-03-08T22:55:37.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485'
2026-03-08T22:55:37.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:37.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:55:37.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672963
2026-03-08T22:55:37.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672963
2026-03-08T22:55:37.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963'
2026-03-08T22:55:37.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:37.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:55:37.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509442
2026-03-08T22:55:37.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509442
2026-03-08T22:55:37.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963 2-64424509442'
2026-03-08T22:55:37.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:37.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485
2026-03-08T22:55:37.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:37.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:55:37.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485
2026-03-08T22:55:37.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:37.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485
2026-03-08T22:55:37.234 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485
2026-03-08T22:55:37.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485'
2026-03-08T22:55:37.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:37.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836485
2026-03-08T22:55:37.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:55:38.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:55:38.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:38.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485
2026-03-08T22:55:38.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:38.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672963
2026-03-08T22:55:38.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:38.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:55:38.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672963
2026-03-08T22:55:38.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:38.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672963
2026-03-08T22:55:38.597 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963
2026-03-08T22:55:38.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672963'
2026-03-08T22:55:38.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:55:38.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672963 -lt 42949672963
2026-03-08T22:55:38.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:38.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509442
2026-03-08T22:55:38.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:38.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:55:38.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509442
2026-03-08T22:55:38.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:38.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509442
2026-03-08T22:55:38.777 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442
2026-03-08T22:55:38.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509442'
2026-03-08T22:55:38.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:55:38.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509442
2026-03-08T22:55:38.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:55:38.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:38.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:39.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0
2026-03-08T22:55:39.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:55:39.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:55:39.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:55:39.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:55:39.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:55:39.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:55:39.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:55:39.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4
2026-03-08T22:55:39.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:55:39.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:39.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:39.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4
2026-03-08T22:55:39.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:55:39.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:55:39.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:287: auto_repair_erasure_coded: create_ec_pool ecpool true k=2 m=1
2026-03-08T22:55:39.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool
2026-03-08T22:55:39.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift
2026-03-08T22:55:39.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=true
2026-03-08T22:55:39.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift
2026-03-08T22:55:39.598 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=1 2026-03-08T22:55:39.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T22:55:39.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T22:55:40.130 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T22:55:40.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:55:41.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' true = true ']' 2026-03-08T22:55:41.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2508: create_ec_pool: ceph osd pool set ecpool allow_ec_overwrites true 2026-03-08T22:55:41.365 INFO:tasks.workunit.client.0.vm03.stderr:set pool 2 allow_ec_overwrites to true 2026-03-08T22:55:41.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T22:55:41.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:55:41.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:55:41.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: 
local cur_active_clean 2026-03-08T22:55:41.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:55:41.381 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:55:41.381 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:55:41.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:55:41.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:55:41.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:55:41.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:55:41.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:55:41.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:55:41.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:55:41.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 
2026-03-08T22:55:41.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:55:41.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:55:41.619 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:55:41.619 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:55:41.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:55:41.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:41.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:55:41.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836487 2026-03-08T22:55:41.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836487 2026-03-08T22:55:41.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487' 2026-03-08T22:55:41.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:41.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:55:41.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: seq=42949672965 2026-03-08T22:55:41.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672965 2026-03-08T22:55:41.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-42949672965' 2026-03-08T22:55:41.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:41.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:55:41.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509444 2026-03-08T22:55:41.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509444 2026-03-08T22:55:41.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-42949672965 2-64424509444' 2026-03-08T22:55:41.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:41.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836487 2026-03-08T22:55:41.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:41.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:55:41.877 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836487 2026-03-08T22:55:41.877 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:41.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836487 2026-03-08T22:55:41.878 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836487 2026-03-08T22:55:41.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836487' 2026-03-08T22:55:41.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:42.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T22:55:42.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:55:43.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:55:43.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:43.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T22:55:43.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T22:55:44.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:55:44.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:44.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836487 2026-03-08T22:55:44.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:44.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672965 2026-03-08T22:55:44.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:44.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:55:44.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672965 2026-03-08T22:55:44.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:44.410 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672965 2026-03-08T22:55:44.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672965 2026-03-08T22:55:44.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting 
osd.1 seq 42949672965' 2026-03-08T22:55:44.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:55:44.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672966 -lt 42949672965 2026-03-08T22:55:44.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:44.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509444 2026-03-08T22:55:44.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:44.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:55:44.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509444 2026-03-08T22:55:44.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:44.588 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509444 2026-03-08T22:55:44.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509444 2026-03-08T22:55:44.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509444' 2026-03-08T22:55:44.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:55:44.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509444 -lt 64424509444 2026-03-08T22:55:44.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:55:44.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:44.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:45.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:55:45.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:55:45.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:55:45.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:55:45.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:55:45.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:55:45.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: 
get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:55:45.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:55:45.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:55:45.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:55:45.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:45.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:45.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:55:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:55:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:55:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T22:55:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:290: auto_repair_erasure_coded: local payload=ABCDEF 2026-03-08T22:55:45.389 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:291: auto_repair_erasure_coded: echo ABCDEF 2026-03-08T22:55:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:292: auto_repair_erasure_coded: rados --pool ecpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T22:55:45.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:296: auto_repair_erasure_coded: get_not_primary ecpool SOMETHING 2026-03-08T22:55:45.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=ecpool 2026-03-08T22:55:45.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T22:55:45.415 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary ecpool SOMETHING 2026-03-08T22:55:45.415 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool 2026-03-08T22:55:45.415 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:55:45.415 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:55:45.415 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:55:45.592 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=2 2026-03-08T22:55:45.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:55:45.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 2)) | .[0]' 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:296: auto_repair_erasure_coded: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:55:45.780 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:55:45.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:55:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:55:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:55:45.885 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:55:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:55:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:55:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:55:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:55:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove 2026-03-08T22:55:46.555 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#-3:00000000:::scrub_1.0:head#, (61) No data available 2026-03-08T22:55:46.555 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:55:47.090 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:55:47.090 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:55:47.090 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:55:47.091 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:55:47.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T22:55:47.091 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:55:47.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:55:47.092 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T22:55:47.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0 2026-03-08T22:55:47.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:55:47.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:55:47.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:55:47.094 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:55:47.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:55:47.111 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:47.105+0000 7fa42b4b68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:47.111 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:47.109+0000 7fa42b4b68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:47.113 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:47.109+0000 7fa42b4b68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:55:47.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:55:47.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:55:47.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:55:47.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:55:47.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:55:47.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:47.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T22:55:47.279 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:55:47.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:47.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:47.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:48.315 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:48.313+0000 7fa42b4b68c0 -1 Falling back to public interface 2026-03-08T22:55:48.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:48.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:48.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:55:48.469 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:55:48.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:48.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:48.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:49.297 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:55:49.293+0000 7fa42b4b68c0 -1 osd.1 30 log_to_monitors true 2026-03-08T22:55:49.645 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:49.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:49.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:55:49.645 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:55:49.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:49.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:49.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:55:50.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:55:50.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:55:50.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:55:50.826 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:55:50.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:55:50.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:55:50.993 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 34 
up_thru 34 down_at 31 last_clean_interval [10,30) [v2:127.0.0.1:6810/220066867,v1:127.0.0.1:6811/220066867] [v2:127.0.0.1:6812/220066867,v1:127.0.0.1:6813/220066867] exists,up 4f288625-5fe0-4545-a3f6-17a5bc564768 2026-03-08T22:55:50.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:55:50.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:55:50.994 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:55:50.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:55:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:55:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:55:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:55:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:55:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:55:51.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:55:51.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:55:51.223 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:55:51.223 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:55:51.223 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:55:51.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:51.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:55:51.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836490 2026-03-08T22:55:51.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836490 2026-03-08T22:55:51.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490' 2026-03-08T22:55:51.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:51.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:55:51.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888066 2026-03-08T22:55:51.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888066 2026-03-08T22:55:51.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-146028888066' 2026-03-08T22:55:51.395 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:55:51.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:55:51.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509447 2026-03-08T22:55:51.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509447 2026-03-08T22:55:51.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-146028888066 2-64424509447' 2026-03-08T22:55:51.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:51.476 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836490 2026-03-08T22:55:51.476 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:51.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:55:51.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836490 2026-03-08T22:55:51.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:51.479 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836490 2026-03-08T22:55:51.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836490' 2026-03-08T22:55:51.479 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836490 2026-03-08T22:55:51.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:51.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836488 -lt 21474836490 2026-03-08T22:55:51.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:55:52.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:55:52.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:55:52.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836490 2026-03-08T22:55:52.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:52.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:52.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-146028888066 2026-03-08T22:55:52.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:55:52.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-146028888066 2026-03-08T22:55:52.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:52.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888066 2026-03-08T22:55:52.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 146028888066' 2026-03-08T22:55:52.832 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 146028888066 2026-03-08T22:55:52.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:55:53.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888066 -lt 146028888066 2026-03-08T22:55:53.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:55:53.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509447 2026-03-08T22:55:53.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:55:53.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T22:55:53.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509447 2026-03-08T22:55:53.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:55:53.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509447 2026-03-08T22:55:53.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509447' 2026-03-08T22:55:53.007 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509447 2026-03-08T22:55:53.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:55:53.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509447 -lt 64424509447 2026-03-08T22:55:53.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:55:53.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:53.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:53.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:55:53.391 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:55:53.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:55:53.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:55:53.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:55:53.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:55:53.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:55:53.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:55:53.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:55:53.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:55:53.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:55:53.566 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:55:53.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:55:53.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:55:53.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:55:53.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:298: auto_repair_erasure_coded: get_pg ecpool SOMETHING 2026-03-08T22:55:53.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T22:55:53.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:55:53.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:55:53.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:55:53.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:298: auto_repair_erasure_coded: local pgid=2.0 2026-03-08T22:55:53.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:299: auto_repair_erasure_coded: get_last_scrub_stamp 2.0 2026-03-08T22:55:53.960 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:55:53.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:55:53.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:55:53.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:55:54.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:299: auto_repair_erasure_coded: wait_for_scrub 2.0 2026-03-08T22:55:45.489114+0000 2026-03-08T22:55:54.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T22:55:54.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:55:45.489114+0000 2026-03-08T22:55:54.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:55:54.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:55:54.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:55:54.132 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:55:54.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:55:54.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:55:54.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:55:54.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:55:54.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:55:45.489114+0000 '>' 2026-03-08T22:55:45.489114+0000
2026-03-08T22:55:54.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:55:55.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:55:55.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:55:55.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:55:55.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:55:55.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:55:55.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:55:55.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:55:55.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:55:45.489114+0000 '>' 2026-03-08T22:55:45.489114+0000
2026-03-08T22:55:55.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:55:56.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:55:56.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:55:56.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:55:56.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:55:56.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:55:56.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:55:56.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:55:54.552112+0000 '>' 2026-03-08T22:55:45.489114+0000
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:300: auto_repair_erasure_coded: wait_for_clean
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:55:56.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:55:56.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:55:56.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:55:56.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:55:56.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:55:56.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:55:56.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:55:56.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:55:56.897 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:55:56.897 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T22:55:56.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:55:56.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:56.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:55:57.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836492
2026-03-08T22:55:57.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836492
2026-03-08T22:55:57.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492'
2026-03-08T22:55:57.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:57.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:55:57.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888068
2026-03-08T22:55:57.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888068
2026-03-08T22:55:57.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-146028888068'
2026-03-08T22:55:57.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:55:57.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:55:57.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509449
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509449
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-146028888068 2-64424509449'
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836492
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836492
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:57.196 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836492
2026-03-08T22:55:57.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836492
2026-03-08T22:55:57.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836492'
2026-03-08T22:55:57.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:57.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836492
2026-03-08T22:55:57.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:55:58.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:55:58.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:55:58.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836492
2026-03-08T22:55:58.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:58.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-146028888068
2026-03-08T22:55:58.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:58.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:55:58.541 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-146028888068
2026-03-08T22:55:58.541 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:58.542 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 146028888068
2026-03-08T22:55:58.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888068
2026-03-08T22:55:58.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 146028888068'
2026-03-08T22:55:58.542 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:55:58.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888068 -lt 146028888068
2026-03-08T22:55:58.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:55:58.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509449
2026-03-08T22:55:58.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:55:58.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:55:58.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509449
2026-03-08T22:55:58.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:55:58.733 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509449
2026-03-08T22:55:58.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509449
2026-03-08T22:55:58.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509449'
2026-03-08T22:55:58.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:55:58.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509449 -lt 64424509449
2026-03-08T22:55:58.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:55:58.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:58.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:59.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:55:59.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:55:59.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:55:59.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:55:59.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:55:59.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:55:59.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:55:59.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:55:59.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:55:59.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:55:59.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:55:59.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:55:59.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:55:59.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:55:59.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:55:59.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:303: auto_repair_erasure_coded: get_not_primary ecpool SOMETHING
2026-03-08T22:55:59.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=ecpool
2026-03-08T22:55:59.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING
2026-03-08T22:55:59.513 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary ecpool SOMETHING
2026-03-08T22:55:59.513 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool
2026-03-08T22:55:59.513 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING
2026-03-08T22:55:59.513 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING
2026-03-08T22:55:59.513 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:55:59.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=2
2026-03-08T22:55:59.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map ecpool SOMETHING
2026-03-08T22:55:59.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 2)) | .[0]'
2026-03-08T22:55:59.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:303: auto_repair_erasure_coded: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:55:59.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:55:59.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:55:59.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T22:55:59.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T22:55:59.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:55:59.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T22:55:59.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:55:59.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:55:59.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING list-attrs
2026-03-08T22:56:00.321 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.3_head,#-3:c0000000:::scrub_1.3:head#, (61) No data available
2026-03-08T22:56:00.321 INFO:tasks.workunit.client.0.vm03.stdout:_
2026-03-08T22:56:00.321 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key
2026-03-08T22:56:00.321 INFO:tasks.workunit.client.0.vm03.stdout:snapset
2026-03-08T22:56:00.321 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#-3:00000000:::scrub_1.0:head#, (61) No data available
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:00.611 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T22:56:00.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T22:56:00.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T22:56:00.613 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T22:56:00.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-auto-repair=true --osd-deep-scrub-interval=5 --osd-scrub-max-interval=5 --osd-scrub-min-interval=5 --osd-scrub-interval-randomize-ratio=0
2026-03-08T22:56:00.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T22:56:00.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T22:56:00.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:56:00.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:56:00.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:56:00.634 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:00.629+0000 7f1776fdc8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:00.634 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:00.633+0000 7f1776fdc8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:00.636 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:00.633+0000 7f1776fdc8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:00.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:00.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:01.834 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:01.833+0000 7f1776fdc8c0 -1 Falling back to public interface
2026-03-08T22:56:01.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:01.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:01.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:01.983 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:56:01.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:01.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:02.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:02.836 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:02.833+0000 7f1776fdc8c0 -1 osd.1 35 log_to_monitors true
2026-03-08T22:56:03.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:03.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:03.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:03.163 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:56:03.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:03.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:03.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:03.979 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:03.977+0000 7f176df8c640 -1 osd.1 35 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T22:56:04.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:04.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:04.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:56:04.348 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T22:56:04.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:04.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:56:04.526 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 39 up_thru 39 down_at 36 last_clean_interval [34,35) [v2:127.0.0.1:6810/2443742223,v1:127.0.0.1:6811/2443742223] [v2:127.0.0.1:6812/2443742223,v1:127.0.0.1:6813/2443742223] exists,up 4f288625-5fe0-4545-a3f6-17a5bc564768 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q 
-o xtrace 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:56:04.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:56:04.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:56:04.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:56:04.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:56:04.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:56:04.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:56:04.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:56:04.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:56:04.770 
INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:56:04.770 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T22:56:04.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:56:04.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:04.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:56:04.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836494 2026-03-08T22:56:04.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836494 2026-03-08T22:56:04.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494' 2026-03-08T22:56:04.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:04.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:56:04.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724546 2026-03-08T22:56:04.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724546 2026-03-08T22:56:04.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836494 1-167503724546' 2026-03-08T22:56:04.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:04.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:56:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509452 2026-03-08T22:56:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509452 2026-03-08T22:56:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-167503724546 2-64424509452' 2026-03-08T22:56:05.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:05.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836494 2026-03-08T22:56:05.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:05.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:56:05.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836494 2026-03-08T22:56:05.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:05.030 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836494 2026-03-08T22:56:05.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836494' 2026-03-08T22:56:05.030 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836494 2026-03-08T22:56:05.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:05.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836493 -lt 21474836494 2026-03-08T22:56:05.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:56:06.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:56:06.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:06.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836494 -lt 21474836494 2026-03-08T22:56:06.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:06.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-167503724546 2026-03-08T22:56:06.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T22:56:06.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:56:06.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-167503724546 2026-03-08T22:56:06.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:06.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724546 2026-03-08T22:56:06.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 167503724546' 2026-03-08T22:56:06.384 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 167503724546 2026-03-08T22:56:06.384 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:06.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724546 -lt 167503724546 2026-03-08T22:56:06.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:06.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509452 2026-03-08T22:56:06.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:06.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T22:56:06.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509452 2026-03-08T22:56:06.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:06.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509452 2026-03-08T22:56:06.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509452' 2026-03-08T22:56:06.563 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509452 2026-03-08T22:56:06.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:56:06.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509452 -lt 64424509452 2026-03-08T22:56:06.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:56:06.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:06.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:06.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:56:06.949 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:06.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:06.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:56:06.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:56:06.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:06.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:06.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:07.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:56:07.131 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:07.131 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:07.131 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:07.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:56:07.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:56:07.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:56:07.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:304: auto_repair_erasure_coded: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T22:56:07.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:305: auto_repair_erasure_coded: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:56:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T22:56:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:56:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:56:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:56:07.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:07.366 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:56:07.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:07.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:56:07.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:56:07.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:56:07.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:56:07.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:56:07.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:56:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:56:07.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:56:07.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:07.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:56:07.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:56:07.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:56:07.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:56:07.518 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:07.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:07.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:56:07.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:56:07.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:56:07.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:56:07.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:56:07.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:56:07.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:56:07.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:56:07.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:56:07.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:56:07.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:56:07.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:56:07.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:07.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:56:07.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:56:07.524 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:56:07.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:56:07.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:56:07.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:56:07.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:56:07.527 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:07.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:07.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:56:07.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:56:07.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:56:07.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T22:56:07.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:56:07.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:07.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:07.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T22:56:07.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_and_repair_jerasure_appends td/osd-scrub-repair 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:815: TEST_corrupt_and_repair_jerasure_appends: corrupt_and_repair_jerasure td/osd-scrub-repair false 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:798: corrupt_and_repair_jerasure: local dir=td/osd-scrub-repair 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:799: corrupt_and_repair_jerasure: local allow_overwrites=false 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:800: corrupt_and_repair_jerasure: local poolname=ecpool 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:802: corrupt_and_repair_jerasure: run_mon td/osd-scrub-repair a 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:56:07.531 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T22:56:07.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T22:56:07.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:56:07.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:07.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:07.555 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:07.556 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:07.556 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:07.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:56:07.556 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T22:56:07.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T22:56:07.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T22:56:07.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:56:07.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:56:07.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T22:56:07.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T22:56:07.583 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:56:07.583 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:56:07.583 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:56:07.583 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:56:07.584 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:07.584 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:07.584 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:56:07.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:56:07.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T22:56:07.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T22:56:07.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T22:56:07.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T22:56:07.653 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T22:56:07.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T22:56:07.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T22:56:07.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T22:56:07.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T22:56:07.654 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T22:56:07.654 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:07.654 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:07.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T22:56:07.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T22:56:07.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T22:56:07.731 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:803: corrupt_and_repair_jerasure: run_mgr td/osd-scrub-repair x 2026-03-08T22:56:07.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T22:56:07.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T22:56:07.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T22:56:07.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T22:56:07.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T22:56:07.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T22:56:07.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T22:56:07.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:07.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:07.836 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:07.836 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: 
'[' -n '' ']' 2026-03-08T22:56:07.836 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:07.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:56:07.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:56:07.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T22:56:07.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: seq 0 3 2026-03-08T22:56:07.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: for id in $(seq 0 3) 2026-03-08T22:56:07.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:805: corrupt_and_repair_jerasure: run_osd td/osd-scrub-repair 0 2026-03-08T22:56:07.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:56:07.856 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:56:07.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T22:56:07.857 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:07.857 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:07.858 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:07.858 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:07.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:56:07.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:56:07.861 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:56:07.862 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:56:07.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T22:56:07.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:56:07.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=36c0e936-bcc4-4654-a9c9-c1f55734e2b6 2026-03-08T22:56:07.864 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 36c0e936-bcc4-4654-a9c9-c1f55734e2b6 2026-03-08T22:56:07.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 36c0e936-bcc4-4654-a9c9-c1f55734e2b6' 2026-03-08T22:56:07.864 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:56:07.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCH/q1piLpKNBAALmk89sOngiSI7Mad3vIO6g== 2026-03-08T22:56:07.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCH/q1piLpKNBAALmk89sOngiSI7Mad3vIO6g=="}' 2026-03-08T22:56:07.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 36c0e936-bcc4-4654-a9c9-c1f55734e2b6 -i td/osd-scrub-repair/0/new.json 2026-03-08T22:56:07.993 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:56:08.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: 
run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T22:56:08.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCH/q1piLpKNBAALmk89sOngiSI7Mad3vIO6g== --osd-uuid 36c0e936-bcc4-4654-a9c9-c1f55734e2b6 2026-03-08T22:56:08.022 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:08.021+0000 7f54ad8838c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:08.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:08.021+0000 7f54ad8838c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:08.026 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:08.021+0000 7f54ad8838c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:08.026 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:08.025+0000 7f54ad8838c0 -1 bdev(0x55b76207ac00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T22:56:08.026 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:08.025+0000 7f54ad8838c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T22:56:10.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T22:56:10.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:10.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T22:56:10.323 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T22:56:10.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:10.429 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T22:56:10.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T22:56:10.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:10.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:10.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:10.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:10.477 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:10.469+0000 7f84785408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:10.486 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:10.485+0000 7f84785408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:10.497 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:10.489+0000 7f84785408c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:10.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T22:56:10.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:10.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T22:56:10.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:10.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:10.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:10.563 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:56:10.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:10.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:10.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T22:56:10.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:11.694 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:56:11.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:11.694 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:11.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:11.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:11.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:56:11.710 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:11.709+0000 7f84785408c0 -1 Falling back to public interface
2026-03-08T22:56:11.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:12.672 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:12.669+0000 7f84785408c0 -1 osd.0 0 log_to_monitors true
2026-03-08T22:56:12.887 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:56:12.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:12.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:12.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:12.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:12.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:56:13.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:13.780 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:13.777+0000 7f8473cf9640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T22:56:14.079 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:56:14.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:14.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:14.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:56:14.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:14.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:56:14.252 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/248909995,v1:127.0.0.1:6803/248909995] [v2:127.0.0.1:6804/248909995,v1:127.0.0.1:6805/248909995] exists,up 36c0e936-bcc4-4654-a9c9-c1f55734e2b6
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: for id in $(seq 0 3)
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:805: corrupt_and_repair_jerasure: run_osd td/osd-scrub-repair 1
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:14.253 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:56:14.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T22:56:14.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:56:14.256 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 4b11297b-7757-4601-a282-873f7025f4c9
2026-03-08T22:56:14.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4b11297b-7757-4601-a282-873f7025f4c9
2026-03-08T22:56:14.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 4b11297b-7757-4601-a282-873f7025f4c9'
2026-03-08T22:56:14.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:56:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCO/q1pU4ogEBAA6MImHbIqs0QXTn6siMJGqQ==
2026-03-08T22:56:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCO/q1pU4ogEBAA6MImHbIqs0QXTn6siMJGqQ=="}'
2026-03-08T22:56:14.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4b11297b-7757-4601-a282-873f7025f4c9 -i td/osd-scrub-repair/1/new.json
2026-03-08T22:56:14.437 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:56:14.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json
2026-03-08T22:56:14.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCO/q1pU4ogEBAA6MImHbIqs0QXTn6siMJGqQ== --osd-uuid 4b11297b-7757-4601-a282-873f7025f4c9
2026-03-08T22:56:14.469 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:14.469+0000 7fa76c90b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:14.475 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:14.469+0000 7fa76c90b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:14.475 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:14.469+0000 7fa76c90b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:14.475 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:14.469+0000 7fa76c90b8c0 -1 bdev(0x5651661a5c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:56:14.475 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:14.469+0000 7fa76c90b8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid
2026-03-08T22:56:16.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring
2026-03-08T22:56:16.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:56:16.987 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T22:56:16.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:56:16.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:56:17.199 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T22:56:17.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:56:17.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:56:17.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:56:17.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:56:17.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:56:17.217 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:17.214+0000 7fb1a7cfb8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:17.217 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:17.214+0000 7fb1a7cfb8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:17.219 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:17.218+0000 7fb1a7cfb8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:17.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:17.682 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:17.682+0000 7fb1a7cfb8c0 -1 Falling back to public interface
2026-03-08T22:56:18.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:18.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:18.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:18.601 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:56:18.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:18.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:18.773 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:18.770+0000 7fb1a7cfb8c0 -1 osd.1 0 log_to_monitors true
2026-03-08T22:56:18.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:19.816 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:56:19.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:19.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:19.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:19.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:19.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:19.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:20.289 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:20.286+0000 7fb1a34b4640 -1 osd.1 0 waiting for initial osdmap
2026-03-08T22:56:20.994 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:56:20.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:20.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:20.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:56:20.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:20.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:56:21.175 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3077357416,v1:127.0.0.1:6811/3077357416] [v2:127.0.0.1:6812/3077357416,v1:127.0.0.1:6813/3077357416] exists,up 4b11297b-7757-4601-a282-873f7025f4c9
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: for id in $(seq 0 3)
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:805: corrupt_and_repair_jerasure: run_osd td/osd-scrub-repair 2
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2'
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal'
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:21.176 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:56:21.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2
2026-03-08T22:56:21.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:56:21.179 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 1ee90050-97fb-4d10-895d-8d12b78ca851
2026-03-08T22:56:21.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1ee90050-97fb-4d10-895d-8d12b78ca851
2026-03-08T22:56:21.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 1ee90050-97fb-4d10-895d-8d12b78ca851'
2026-03-08T22:56:21.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:56:21.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCV/q1pCYyFCxAAipspO1etf4laj/HsOTqfIA==
2026-03-08T22:56:21.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCV/q1pCYyFCxAAipspO1etf4laj/HsOTqfIA=="}'
2026-03-08T22:56:21.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1ee90050-97fb-4d10-895d-8d12b78ca851 -i td/osd-scrub-repair/2/new.json
2026-03-08T22:56:21.370 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:56:21.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json
2026-03-08T22:56:21.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCV/q1pCYyFCxAAipspO1etf4laj/HsOTqfIA== --osd-uuid 1ee90050-97fb-4d10-895d-8d12b78ca851
2026-03-08T22:56:21.403 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:21.402+0000 7ff93f3f08c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:21.406 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:21.406+0000 7ff93f3f08c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:21.407 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:21.406+0000 7ff93f3f08c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:21.407 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:21.406+0000 7ff93f3f08c0 -1 bdev(0x5650e62cfc00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted
2026-03-08T22:56:21.407 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:21.406+0000 7ff93f3f08c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid
2026-03-08T22:56:24.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring
2026-03-08T22:56:24.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:56:24.152 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository
2026-03-08T22:56:24.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T22:56:24.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:56:24.363 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2
2026-03-08T22:56:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T22:56:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:56:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:56:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:56:24.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:56:24.386 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:24.386+0000 7f23d37178c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:24.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:24.390+0000 7f23d37178c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:24.392 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:24.390+0000 7f23d37178c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:24.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:56:24.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:25.590 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:25.590+0000 7f23d37178c0 -1 Falling back to public interface
2026-03-08T22:56:25.773 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:56:25.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:25.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:25.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:25.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:25.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:56:25.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:26.949 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:56:26.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:26.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:26.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:26.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:26.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T22:56:27.079 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:27.078+0000 7f23d37178c0 -1 osd.2 0 log_to_monitors true
2026-03-08T22:56:27.141
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:28.142 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:56:28.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:28.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:28.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:56:28.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:28.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:28.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:29.338 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T22:56:29.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:29.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:29.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:56:29.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:29.339 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:56:29.522 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/37463171,v1:127.0.0.1:6819/37463171] [v2:127.0.0.1:6820/37463171,v1:127.0.0.1:6821/37463171] exists,up 1ee90050-97fb-4d10-895d-8d12b78ca851 2026-03-08T22:56:29.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:29.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:29.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:29.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: for id in $(seq 0 3) 2026-03-08T22:56:29.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:805: corrupt_and_repair_jerasure: run_osd td/osd-scrub-repair 3 2026-03-08T22:56:29.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:56:29.523 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:56:29.523 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:56:29.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:56:29.524 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:56:29.524 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:56:29.525 INFO:tasks.workunit.client.0.vm03.stdout:add osd3 2e2f6557-d446-402f-b97c-10a3115fde24 2026-03-08T22:56:29.525 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2e2f6557-d446-402f-b97c-10a3115fde24 2026-03-08T22:56:29.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 2e2f6557-d446-402f-b97c-10a3115fde24' 2026-03-08T22:56:29.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:56:29.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCd/q1pBoEnIBAAHSVbUevIlE1H/XAJxFU/7Q== 2026-03-08T22:56:29.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCd/q1pBoEnIBAAHSVbUevIlE1H/XAJxFU/7Q=="}' 2026-03-08T22:56:29.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2e2f6557-d446-402f-b97c-10a3115fde24 -i td/osd-scrub-repair/3/new.json 2026-03-08T22:56:29.792 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:56:29.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/3/new.json 2026-03-08T22:56:29.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCd/q1pBoEnIBAAHSVbUevIlE1H/XAJxFU/7Q== --osd-uuid 2e2f6557-d446-402f-b97c-10a3115fde24 2026-03-08T22:56:29.823 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:29.822+0000 7feb893c38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:29.831 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:29.830+0000 7feb893c38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:29.834 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:29.830+0000 7feb893c38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:29.834 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:29.834+0000 7feb893c38c0 -1 bdev(0x5564b4ba5c00 td/osd-scrub-repair/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:56:29.834 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:29.834+0000 7feb893c38c0 -1 bluestore(td/osd-scrub-repair/3) _read_fsid unparsable uuid 2026-03-08T22:56:32.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/3/keyring 2026-03-08T22:56:32.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:56:32.108 INFO:tasks.workunit.client.0.vm03.stdout:adding osd3 key to auth repository 2026-03-08T22:56:32.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:56:32.108 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:56:32.325 INFO:tasks.workunit.client.0.vm03.stdout:start osd.3 2026-03-08T22:56:32.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:56:32.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:56:32.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:56:32.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:56:32.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:56:32.341 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:32.338+0000 7ff8fd16b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:56:32.345 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:32.346+0000 7ff8fd16b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:32.347 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:32.346+0000 7ff8fd16b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:56:32.524 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:32.716 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:33.539 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:33.538+0000 7ff8fd16b8c0 -1 Falling back to public interface 2026-03-08T22:56:33.718 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:56:33.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:33.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:33.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:56:33.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:33.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:33.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:34.528 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:34.526+0000 7ff8fd16b8c0 -1 osd.3 0 log_to_monitors true 2026-03-08T22:56:34.892 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:56:34.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:34.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:34.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:56:34.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:34.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:35.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:56:36.087 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:56:36.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:56:36.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:56:36.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:56:36.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:56:36.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:56:36.262 INFO:tasks.workunit.client.0.vm03.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3730779925,v1:127.0.0.1:6827/3730779925] [v2:127.0.0.1:6828/3730779925,v1:127.0.0.1:6829/3730779925] exists,up 2e2f6557-d446-402f-b97c-10a3115fde24 2026-03-08T22:56:36.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:56:36.263 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:56:36.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:56:36.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:807: corrupt_and_repair_jerasure: create_rbd_pool 2026-03-08T22:56:36.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T22:56:36.509 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T22:56:36.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T22:56:36.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:56:36.852 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T22:56:36.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:56:37.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:808: corrupt_and_repair_jerasure: wait_for_clean 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:56:38.177 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:56:38.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:56:38.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:56:38.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:56:38.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:56:38.242 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:56:38.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:56:38.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:56:38.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:56:38.471 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:56:38.471 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:56:38.471 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:56:38.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:56:38.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:38.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:56:38.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 2026-03-08T22:56:38.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T22:56:38.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T22:56:38.560 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:38.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:56:38.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672965 2026-03-08T22:56:38.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672965 2026-03-08T22:56:38.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965' 2026-03-08T22:56:38.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:38.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:56:38.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509443 2026-03-08T22:56:38.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509443 2026-03-08T22:56:38.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-64424509443' 2026-03-08T22:56:38.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:38.923 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:56:39.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345922 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345922 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-64424509443 3-85899345922' 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836486 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
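The sequence numbers collected above (`0-21474836486`, `1-42949672965`, `2-64424509443`, `3-85899345922`) are 64-bit values returned by `ceph tell osd.N flush_pg_stats`. They appear to pack two 32-bit fields: an epoch-like value in the high word and a small per-OSD counter in the low word — this split is an inference from the values in this trace, not something the log states. A minimal sketch decoding them:

```shell
# Split a 64-bit flush_pg_stats sequence number into its apparent
# high/low 32-bit halves. Assumption (inferred from this trace only):
# high word is epoch-like, low word is a per-OSD counter.
decode_seq() {
    local seq=$1
    echo "high=$(( seq >> 32 )) low=$(( seq & 0xFFFFFFFF ))"
}

decode_seq 21474836486   # osd.0 -> high=5  low=6
decode_seq 42949672965   # osd.1 -> high=10 low=5
decode_seq 64424509443   # osd.2 -> high=15 low=3
decode_seq 85899345922   # osd.3 -> high=20 low=2
```

This explains why the later flush round for osd.0 reports 21474836488: same high word (5), counter advanced from 6 to 8.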
seq=21474836486 2026-03-08T22:56:39.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T22:56:39.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:39.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836486 2026-03-08T22:56:39.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:39.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672965 2026-03-08T22:56:39.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:39.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:56:39.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672965 2026-03-08T22:56:39.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:39.224 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672965 2026-03-08T22:56:39.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672965 2026-03-08T22:56:39.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.1 seq 42949672965' 2026-03-08T22:56:39.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:39.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672964 -lt 42949672965 2026-03-08T22:56:39.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:56:40.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:56:40.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:40.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672964 -lt 42949672965 2026-03-08T22:56:40.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:56:41.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:56:41.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:41.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672965 2026-03-08T22:56:41.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T22:56:41.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509443 2026-03-08T22:56:41.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:41.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:56:41.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509443 2026-03-08T22:56:41.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:41.757 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509443 2026-03-08T22:56:41.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509443 2026-03-08T22:56:41.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509443' 2026-03-08T22:56:41.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:56:41.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509444 -lt 64424509443 2026-03-08T22:56:41.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:41.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
3-85899345922 2026-03-08T22:56:41.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:41.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:56:41.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345922 2026-03-08T22:56:41.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:41.984 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345922 2026-03-08T22:56:41.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345922 2026-03-08T22:56:41.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345922' 2026-03-08T22:56:41.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:56:42.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345922 -lt 85899345922 2026-03-08T22:56:42.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:56:42.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:42.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: 
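The xtrace interleaved above (ceph-helpers.sh lines 2260–2279) is one full pass of `flush_pg_stats`: list the OSDs, tell each one to flush its PG stats and record the returned sequence number, then poll `ceph osd last-stat-seq` until the monitor has caught up to each seq, sleeping 1s per retry against a countdown timeout. A reconstruction from the trace (a sketch, not the verbatim helper):

```shell
# Reconstructed from the xtrace: flush PG stats on every OSD, then wait
# until the mon's last-stat-seq for each OSD reaches the seq the OSD
# returned, with a 1s-per-retry countdown timeout.
flush_pg_stats() {
    local timeout=${1:-300}
    local ids seqs osd seq s
    ids=$(ceph osd ls)
    for osd in $ids; do
        seq=$(ceph tell osd.$osd flush_pg_stats)
        if test -z "$seq"; then
            echo "error: flush_pg_stats osd.$osd returned no seq" >&2
            return 1
        fi
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=${s%-*}          # the trace uses `cut -d - -f 1`
        seq=${s#*-}          # the trace uses `cut -d - -f 2`
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
            sleep 1
            if [ "$timeout" -eq 0 ]; then
                echo "timeout waiting for osd.$osd" >&2
                return 1
            fi
            timeout=$(( timeout - 1 ))
        done
    done
}
```

This matches the behavior visible in the trace: osd.1's mon-side seq lagged by one (42949672964 vs 42949672965), so the loop slept twice before the `-lt` test turned false.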
get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:42.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T22:56:42.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:42.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:42.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:56:42.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:56:42.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:42.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:42.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:42.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T22:56:42.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:42.676 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:42.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:810: corrupt_and_repair_jerasure: create_ec_pool ecpool false k=2 m=2 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=false 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T22:56:42.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=2 2026-03-08T22:56:43.126 
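The `wait_for_clean` trace above reduces to: flush PG stats, read the total PG count from `ceph status`, then compare it against the number of PGs whose state contains `active` and `clean` but not `stale` in `ceph pg dump pgs` (here 4 = 4, so it broke out on the first probe). A sketch of that core check, reconstructed from the traced jq expressions; `wait_for_clean_once` is a hypothetical name for a single probe, since the real helper wraps it in a retry loop with backoff delays:

```shell
# Traced helpers: total PG count vs. count of active+clean, non-stale PGs.
get_num_pgs() {
    ceph --format json status | jq .pgmap.num_pgs
}

get_num_active_clean() {
    ceph --format json pg dump pgs | \
        jq '.pg_stats | [.[] | .state |
            select(contains("active") and contains("clean")) |
            select(contains("stale") | not)] | length'
}

wait_for_clean_once() {
    # One probe of the clean condition (the real wait_for_clean retries).
    local num_pgs cur
    num_pgs=$(get_num_pgs)
    test "$num_pgs" = 0 && return 1   # no PGs yet means not clean
    cur=$(get_num_active_clean)
    test "$cur" = "$num_pgs"
}
```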
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T22:56:43.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T22:56:43.415 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T22:56:43.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' false = true ']' 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: echo true 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:56:44.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:56:44.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:56:44.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:56:44.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:56:44.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:56:44.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:56:44.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:56:44.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:56:44.662 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:56:44.662 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:56:44.662 INFO:tasks.workunit.client.0.vm03.stderr:3' 
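The `delays` array above comes from `get_timeout_delays 90 .1`: delays double from the initial value, are capped (at 15 in this run), and a final fractional delay tops the sum up to the requested timeout (0.1+…+12.8 = 25.5, four 15s reach 85.5, leaving 4.5). A sketch reproducing that schedule — the cap of 15 and the remainder behavior are inferred from this trace:

```shell
# Emit a backoff schedule: doubling delays capped at $max, padded with a
# final remainder so the delays sum to exactly $timeout.
get_timeout_delays() {
    local timeout=$1 first=$2 max=${3:-15}
    awk -v t="$timeout" -v d="$first" -v max="$max" 'BEGIN {
        total = 0; sep = ""
        while (total + d < t) {
            printf "%s%g", sep, d; sep = " "
            total += d
            d *= 2
            if (d > max) d = max
        }
        if (t > total) printf "%s%g", sep, t - total
        print ""
    }'
}

get_timeout_delays 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```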
2026-03-08T22:56:44.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:56:44.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:44.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:56:44.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836488 2026-03-08T22:56:44.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836488 2026-03-08T22:56:44.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488' 2026-03-08T22:56:44.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:44.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:56:44.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672967 2026-03-08T22:56:44.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672967 2026-03-08T22:56:44.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488 1-42949672967' 2026-03-08T22:56:44.830 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:44.830 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:56:44.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509446 2026-03-08T22:56:44.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509446 2026-03-08T22:56:44.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488 1-42949672967 2-64424509446' 2026-03-08T22:56:44.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:56:44.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:56:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345924 2026-03-08T22:56:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345924 2026-03-08T22:56:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488 1-42949672967 2-64424509446 3-85899345924' 2026-03-08T22:56:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:44.990 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836488 2026-03-08T22:56:44.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:44.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:56:44.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836488 2026-03-08T22:56:44.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:44.993 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836488 2026-03-08T22:56:44.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836488 2026-03-08T22:56:44.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836488' 2026-03-08T22:56:44.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:45.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836488 2026-03-08T22:56:45.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:56:46.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T22:56:46.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:46.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836488 2026-03-08T22:56:46.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:56:47.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:56:47.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:56:47.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836489 -lt 21474836488 2026-03-08T22:56:47.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:47.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672967 2026-03-08T22:56:47.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:47.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:56:47.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672967 2026-03-08T22:56:47.482 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:47.483 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672967 2026-03-08T22:56:47.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672967 2026-03-08T22:56:47.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672967' 2026-03-08T22:56:47.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:47.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672968 -lt 42949672967 2026-03-08T22:56:47.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:47.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509446 2026-03-08T22:56:47.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:47.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:56:47.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509446 2026-03-08T22:56:47.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T22:56:47.747 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509446 2026-03-08T22:56:47.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509446 2026-03-08T22:56:47.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509446' 2026-03-08T22:56:47.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:56:48.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509446 -lt 64424509446 2026-03-08T22:56:48.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:48.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345924 2026-03-08T22:56:48.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:48.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:56:48.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345924 2026-03-08T22:56:48.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:48.041 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345924 2026-03-08T22:56:48.041 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345924
2026-03-08T22:56:48.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345924'
2026-03-08T22:56:48.041 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:56:48.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345924 -lt 85899345924
2026-03-08T22:56:48.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:56:48.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:56:48.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:56:48.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:56:48.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:56:48.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:56:48.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:56:48.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:56:48.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:56:48.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:56:48.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:56:48.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:56:48.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:56:48.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:56:48.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:811: corrupt_and_repair_jerasure: corrupt_and_repair_erasure_coded td/osd-scrub-repair ecpool
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:248: corrupt_and_repair_erasure_coded: local dir=td/osd-scrub-repair
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:249: corrupt_and_repair_erasure_coded: local poolname=ecpool
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:251: corrupt_and_repair_erasure_coded: add_something td/osd-scrub-repair ecpool
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T22:56:48.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T22:56:49.024 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T22:56:49.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T22:56:49.227 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T22:56:49.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T22:56:49.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T22:56:49.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put SOMETHING td/osd-scrub-repair/ORIGINAL
2026-03-08T22:56:49.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:253: corrupt_and_repair_erasure_coded: get_primary ecpool SOMETHING
2026-03-08T22:56:49.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool
2026-03-08T22:56:49.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING
2026-03-08T22:56:49.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING
2026-03-08T22:56:49.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T22:56:49.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:253: corrupt_and_repair_erasure_coded: local primary=3
2026-03-08T22:56:49.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: get_osds ecpool SOMETHING
2026-03-08T22:56:49.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool
2026-03-08T22:56:49.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING
2026-03-08T22:56:49.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: sed -e s/3//
2026-03-08T22:56:49.457 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHING
2026-03-08T22:56:49.458 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:0'
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 0
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: osds=('1' '2' '0')
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: local -a osds
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:255: corrupt_and_repair_erasure_coded: local not_primary_first=1
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:256: corrupt_and_repair_erasure_coded: local not_primary_second=2
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:259: corrupt_and_repair_erasure_coded: corrupt_and_repair_one td/osd-scrub-repair ecpool 3
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=ecpool
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=3
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove
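The `get_osds` / `sed -e s/3//` pipeline traced above reduces the acting set `3 1 2 0` to the non-primary shards `1 2 0` before the test corrupts each shard in turn. A stand-alone sketch of that filtering step, with the acting set and primary id copied from the log (the anchored sed pattern is my addition for safety; the helper as traced uses a bare `s/3//`):

```shell
# Stand-alone sketch of the non-primary-shard filtering seen in the trace.
# In a real run, $acting comes from:
#   ceph --format json osd map <pool> <obj> | jq '.acting | .[]'
primary=3
acting='3
1
2
0'
# Drop the primary id, then flatten the remaining ids onto one line.
# Anchors (^...$) are added here so a multi-digit id is never clipped.
others=$(printf '%s\n' "$acting" | sed -e "s/^${primary}\$//" | xargs echo)
echo "$others"
```

The result matches the `osds=('1' '2' '0')` array the trace shows `corrupt_and_repair_erasure_coded` building.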
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T22:56:49.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3
2026-03-08T22:56:49.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
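The `shopt -q -o xtrace` / `shopt -u -o xtrace` pair that `kill_daemons` runs here is a save-and-restore idiom: remember whether `set -x` tracing was on, silence it around a noisy internal loop, then put it back. A hedged POSIX sketch of the same idea, using a `case $-` check in place of bash's `shopt` probe:

```shell
# Sketch of the trace save/restore idiom used by kill_daemons and
# get_timeout_delays: suppress xtrace for a noisy section, then restore
# whatever setting the caller had.
quiet_section() {
    case $- in *x*) trace=true ;; *) trace=false ;; esac
    if [ "$trace" = true ]; then set +x; fi   # silence tracing for the body
    : noisy work would run here               # placeholder for the helper's loop
    if [ "$trace" = true ]; then set -x; fi   # restore the caller's setting
    return 0
}
set -x           # tracing on, as in the test harness
quiet_section    # body runs untraced, tracing restored afterwards
set +x
```

This is why the trace goes quiet between each `shopt -u -o xtrace` and the helper's `return`.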
2026-03-08T22:56:49.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T22:56:49.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T22:56:49.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T22:56:49.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T22:56:49.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T22:56:49.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove
2026-03-08T22:56:49.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T22:56:49.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T22:56:49.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3
2026-03-08T22:56:49.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T22:56:49.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3
2026-03-08T22:56:49.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove
2026-03-08T22:56:50.377 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#2:eb822e21:::SOMETHING:head#
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3'
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal'
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:56:50.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T22:56:50.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3
2026-03-08T22:56:50.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3
2026-03-08T22:56:50.910 INFO:tasks.workunit.client.0.vm03.stderr:start osd.3
2026-03-08T22:56:50.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:56:50.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami
2026-03-08T22:56:50.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']'
2026-03-08T22:56:50.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:56:50.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:56:50.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:56:50.928 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:50.926+0000 7f8a920bd8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:50.928 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:50.926+0000 7f8a920bd8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:56:50.930 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:50.930+0000 7f8a920bd8c0 -1 WARNING: all dangerous and experimental features are enabled.
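The `wait_for_osd` calls that follow are a bounded polling loop: up to 300 one-second attempts at `ceph osd dump | grep "osd.$id up"`. A minimal sketch of that pattern, with the ceph check swapped for an arbitrary predicate and `sleep 0` standing in for the real one-second delay:

```shell
# Bounded polling loop in the style of wait_for_osd: retry a predicate up
# to $tries times, returning 0 as soon as it holds and 1 if it never does.
wait_for() {
    check=$1 tries=$2
    status=1 i=0
    while [ "$i" -lt "$tries" ]; do
        echo "$i"                  # the helper echoes the attempt number
        if $check; then status=0; break; fi
        sleep 0                    # real helper: sleep 1 between polls
        i=$((i + 1))
    done
    return $status
}
wait_for true 300 || echo "never came up"   # succeeds on the first poll
wait_for false 3 || echo "gave up after 3"  # exhausts retries, returns 1
```

The echoed attempt numbers (`0`, `1`, `2`, ...) are the bare digits interleaved with the `ceph osd dump | grep` lines in the trace below.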
2026-03-08T22:56:51.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3
2026-03-08T22:56:51.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:56:51.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T22:56:51.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:56:51.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:56:51.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:51.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:56:51.093 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T22:56:51.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:51.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:56:51.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:51.890 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:51.890+0000 7f8a920bd8c0 -1 Falling back to public interface
2026-03-08T22:56:52.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:52.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:52.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:56:52.265 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:56:52.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:52.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:56:52.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:52.879 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:52.878+0000 7f8a920bd8c0 -1 osd.3 35 log_to_monitors true
2026-03-08T22:56:53.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:53.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:53.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:56:53.437 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:56:53.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:53.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:56:53.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:56:54.237 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:56:54.238+0000 7f8a8906d640 -1 osd.3 35 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T22:56:54.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:56:54.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:56:54.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:56:54.615 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T22:56:54.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:56:54.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:56:54.779 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 up in weight 1 up_from 39 up_thru 39 down_at 36 last_clean_interval [20,35) [v2:127.0.0.1:6826/548828593,v1:127.0.0.1:6827/548828593] [v2:127.0.0.1:6828/548828593,v1:127.0.0.1:6829/548828593] exists,up 2e2f6557-d446-402f-b97c-10a3115fde24
2026-03-08T22:56:54.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:56:54.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:56:54.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:56:54.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:56:54.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:56:54.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:56:54.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:56:54.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:56:54.780 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:56:54.780 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:56:54.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:56:54.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:56:54.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:56:54.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:56:54.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:56:54.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:56:54.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:56:54.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:56:54.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:56:55.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:56:55.015 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T22:56:55.015 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T22:56:55.015 INFO:tasks.workunit.client.0.vm03.stderr:3'
2026-03-08T22:56:55.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:56:55.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:56:55.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:56:55.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836492
2026-03-08T22:56:55.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836492
2026-03-08T22:56:55.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492'
2026-03-08T22:56:55.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:56:55.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:56:55.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970
2026-03-08T22:56:55.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970
2026-03-08T22:56:55.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-42949672970'
2026-03-08T22:56:55.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:56:55.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:56:55.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509449
2026-03-08T22:56:55.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509449
2026-03-08T22:56:55.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-42949672970 2-64424509449'
2026-03-08T22:56:55.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:56:55.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:56:55.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724546
2026-03-08T22:56:55.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724546
2026-03-08T22:56:55.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-42949672970 2-64424509449 3-167503724546'
2026-03-08T22:56:55.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:56:55.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836492
2026-03-08T22:56:55.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:56:55.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:56:55.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836492
2026-03-08T22:56:55.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:56:55.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836492
2026-03-08T22:56:55.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836492'
2026-03-08T22:56:55.329 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836492
2026-03-08T22:56:55.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:56:55.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836492
2026-03-08T22:56:55.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:56:56.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:56:56.497 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:56:56.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836492
2026-03-08T22:56:56.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:56:56.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo
1-42949672970 2026-03-08T22:56:56.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:56.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:56:56.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970 2026-03-08T22:56:56.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:56.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970 2026-03-08T22:56:56.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970' 2026-03-08T22:56:56.674 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672970 2026-03-08T22:56:56.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:56:56.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672971 -lt 42949672970 2026-03-08T22:56:56.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:56.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509449 2026-03-08T22:56:56.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:56.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:56:56.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509449 2026-03-08T22:56:56.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:56.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509449 2026-03-08T22:56:56.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509449' 2026-03-08T22:56:56.847 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509449 2026-03-08T22:56:56.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:56:57.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509449 -lt 64424509449 2026-03-08T22:56:57.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:56:57.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-167503724546 2026-03-08T22:56:57.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:56:57.012 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:56:57.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-167503724546 2026-03-08T22:56:57.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:56:57.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724546 2026-03-08T22:56:57.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 167503724546' 2026-03-08T22:56:57.014 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 167503724546 2026-03-08T22:56:57.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:56:57.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724546 -lt 167503724546 2026-03-08T22:56:57.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:56:57.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:57.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:57.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 
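The trace through `ceph-helpers.sh:2260-2279` above shows the two phases of `flush_pg_stats`: tell every OSD to flush its PG stats (recording the returned sequence number as `osd-seq` pairs), then poll `ceph osd last-stat-seq` until the monitor has caught up to each sequence. A runnable sketch of that pattern, with a stub `ceph` function standing in for the real CLI (the stub reports the flushes as already visible, so the wait loop exits immediately; the timeout handling is simplified relative to the real helper):

```shell
#!/usr/bin/env bash
# Stub standing in for the real `ceph` CLI so the sketch runs offline.
declare -A LAST_SEQ=( [0]=12 [1]=22 )
ceph() {
    case "$1 $2" in
        "osd ls")            printf '0\n1\n' ;;
        "tell osd.0")        echo 12 ;;
        "tell osd.1")        echo 22 ;;
        "osd last-stat-seq") echo "${LAST_SEQ[$3]}" ;;
    esac
}

flush_pg_stats() {
    local timeout=${1:-300}
    local ids seqs osd seq s
    ids=$(ceph osd ls)
    seqs=''
    # Phase 1: flush each OSD, remembering the sequence it reports.
    for osd in $ids; do
        seq=$(ceph tell osd.$osd flush_pg_stats)
        test -z "$seq" && return 1
        seqs="$seqs ${osd}-${seq}"
    done
    # Phase 2: wait until the mon's last-stat-seq reaches each flush.
    for s in $seqs; do
        osd=${s%%-*}
        seq=${s##*-}
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 1
            timeout=$((timeout - 1))
            test "$timeout" -eq 0 && return 1
        done
    done
}

flush_pg_stats && echo flushed
```

In the log, osd.0's first `last-stat-seq` poll returned 21474836490 (less than the flushed 21474836492), which is why a one-second sleep and a second poll appear before the loop advanced to osd.1.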
2026-03-08T22:56:57.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:56:57.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:56:57.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:56:57.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:56:57.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:56:57.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:56:57.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:56:57.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:56:57.542 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:56:57.542 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:56:57.542 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:56:57.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:56:57.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:56:57.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:56:57.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg ecpool SOMETHING 2026-03-08T22:56:57.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T22:56:57.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:56:57.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:56:57.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:56:57.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=2.0 2026-03-08T22:56:57.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 2.0 2026-03-08T22:56:57.915 
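`wait_for_clean` above succeeds when `get_num_active_clean` equals `get_num_pgs` (5 == 5). The jq filter in the trace selects PG states containing both "active" and "clean" while excluding any containing "stale". A plain-bash rendering of that same selection over hypothetical PG states, so the sketch runs without a cluster or jq:

```shell
#!/usr/bin/env bash
# Hypothetical PG states; only non-stale active+clean ones should count,
# mirroring the jq select(contains(...)) filter seen in the trace.
states=("active+clean" "active+clean+scrubbing" "stale+active+clean" "active+recovering" "peering")

num_active_clean=0
for s in "${states[@]}"; do
    # Exclude stale PGs, then require both "active" and "clean" substrings.
    if [[ $s != *stale* && $s == *active* && $s == *clean* ]]; then
        num_active_clean=$((num_active_clean + 1))
    fi
done
echo "$num_active_clean"
```

Note that substring matching is deliberate: a PG in `active+clean+scrubbing` still counts as clean, while `stale+active+clean` does not.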
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T22:56:57.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T22:56:57.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:56:57.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:56:57.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:56:57.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:56:58.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:56:43.415229+0000 2026-03-08T22:56:58.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T22:56:58.239 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T22:56:58.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T22:56:43.415229+0000 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T22:56:58.252 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:56:43.415229+0000 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:56:58.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:56:58.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:43.415229+0000 '>' 2026-03-08T22:56:43.415229+0000 2026-03-08T22:56:58.419 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:56:59.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:56:59.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:56:59.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:56:59.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:56:59.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:56:59.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:56:59.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:56:59.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:43.415229+0000 '>' 2026-03-08T22:56:43.415229+0000 2026-03-08T22:56:59.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:00.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T22:57:00.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:00.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:00.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:00.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:00.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:00.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:00.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:43.415229+0000 '>' 2026-03-08T22:56:43.415229+0000 2026-03-08T22:57:00.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:01.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:01.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:01.769 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:01.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:01.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:01.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:01.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:01.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:43.415229+0000 '>' 2026-03-08T22:56:43.415229+0000 2026-03-08T22:57:01.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:02.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:02.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:02.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:02.955 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:02.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:02.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:02.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:03.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:43.415229+0000 '>' 2026-03-08T22:56:43.415229+0000 2026-03-08T22:57:03.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:04.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:04.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:04.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:04.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:04.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:04.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:04.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:04.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:43.415229+0000 '>' 2026-03-08T22:56:43.415229+0000 2026-03-08T22:57:04.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:05.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:05.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:05.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:05.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:05.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:05.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:05.320 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:58.834305+0000 '>' 2026-03-08T22:56:43.415229+0000 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:57:05.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:05.499 
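The `wait_for_scrub` loop above (`ceph-helpers.sh:2072-2084`) polls `last_scrub_stamp` up to 300 times and relies on `test "$stamp" '>' "$last_scrub"` — a lexicographic string comparison, which is valid here because ISO-8601 timestamps sort chronologically as strings. A sketch of that loop with a stub stamp source that "advances" after two polls (the stub and its behavior are illustrative; the real helper sleeps 1s between polls and queries `pg dump pgs`):

```shell
#!/usr/bin/env bash
stamp="2026-03-08T22:56:43.415229+0000"
polls=0
get_stamp() {   # stub for get_last_scrub_stamp: advances after two polls
    polls=$((polls + 1))
    if [ "$polls" -ge 3 ]; then
        stamp="2026-03-08T22:56:58.834305+0000"
    fi
}

wait_for_scrub() {
    local last_scrub=$1 i
    for ((i = 0; i < 300; i++)); do
        get_stamp
        # ISO-8601 stamps compare correctly as strings, so '>' means "newer".
        if test "$stamp" '>' "$last_scrub"; then
            return 0
        fi
        # the real helper does `sleep 1` here; skipped in the sketch
    done
    return 1
}

wait_for_scrub "2026-03-08T22:56:43.415229+0000" && echo scrubbed
```

This matches the log exactly: several polls return the unchanged stamp 22:56:43.415229, then 22:56:58.834305 compares greater and the function returns 0.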
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:05.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:57:05.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:05.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T22:57:05.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:05.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:05.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:05.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:05.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:57:05.606 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:57:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:57:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING list-attrs 2026-03-08T22:57:05.959 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T22:57:05.959 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key 2026-03-08T22:57:05.959 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T22:57:05.959 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 2.0s0_head,0#-4:00000000:::scrub_2.0s0:head#, (61) No data available 2026-03-08T22:57:06.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T22:57:06.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:06.242 
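The `_objectstore_tool_nowait` sequence above illustrates why the helper brackets the tool invocation with daemon control: `ceph-objectstore-tool` needs exclusive offline access to the OSD's data directory, so the OSD is stopped (`kill_daemons ... TERM osd.3`), the tool runs against `--data-path td/osd-scrub-repair/3`, and the OSD is restarted (`activate_osd`). A simplified sketch of that stop/run/restart pattern with stub functions in place of the real helpers (the stub signatures are reduced; the real `kill_daemons` also takes a signal argument):

```shell
#!/usr/bin/env bash
# Stubs standing in for the real helpers and for ceph-objectstore-tool.
kill_daemons() { echo "stop osd.$2"; }
activate_osd() { echo "start osd.$2"; }
objstore_tool() { echo "tool: $*"; }

objectstore_tool_nowait() {
    local dir=$1 id=$2
    shift 2
    # The tool requires the OSD to be down before touching its store.
    kill_daemons "$dir" "$id" || return 1
    objstore_tool --data-path "$dir/$id" "$@" || return 1
    # Bring the OSD back so the cluster can recover.
    activate_osd "$dir" "$id"
}

objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs
```

Incidentally, the "(61) No data available" stderr line in the log is the tool failing to read attrs on the internal `scrub_2.0s0` head object; the expected user-object attrs (`_`, `hinfo_key`, `snapset`) were listed on stdout just before it.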
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:06.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:57:06.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:06.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:57:06.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:06.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:57:06.243 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:06.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:06.244 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:57:06.245 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:57:06.245 INFO:tasks.workunit.client.0.vm03.stderr:start osd.3 2026-03-08T22:57:06.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:06.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T22:57:06.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:57:06.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:06.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:06.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:06.261 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:06.262+0000 
7f5b795fd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:06.266 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:06.266+0000 7f5b795fd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:06.272 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:06.266+0000 7f5b795fd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:06.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:06.605 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:07.474 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:07.474+0000 7f5b795fd8c0 -1 Falling back to public interface 2026-03-08T22:57:07.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:07.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:07.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:07.607 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:57:07.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:07.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:07.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:08.497 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:08.498+0000 7f5b795fd8c0 -1 osd.3 40 log_to_monitors true 2026-03-08T22:57:08.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:08.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:08.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:08.784 
INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:57:08.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:08.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:08.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:09.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:09.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:09.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:09.964 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:57:09.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:09.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:10.132 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 up in weight 1 up_from 44 up_thru 44 down_at 41 last_clean_interval [39,40) [v2:127.0.0.1:6826/4256948340,v1:127.0.0.1:6827/4256948340] [v2:127.0.0.1:6828/4256948340,v1:127.0.0.1:6829/4256948340] exists,up 2e2f6557-d446-402f-b97c-10a3115fde24 2026-03-08T22:57:10.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:10.132 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:10.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:10.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:10.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:10.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:10.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:10.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:10.133 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:10.133 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:10.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:10.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:10.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:10.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:10.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:10.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:10.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:10.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:10.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:10.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:10.386 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:57:10.387 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:57:10.387 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:57:10.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:10.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:10.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 
flush_pg_stats 2026-03-08T22:57:10.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836496 2026-03-08T22:57:10.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836496 2026-03-08T22:57:10.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496' 2026-03-08T22:57:10.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:10.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:10.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672974 2026-03-08T22:57:10.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672974 2026-03-08T22:57:10.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672974' 2026-03-08T22:57:10.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:10.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509453 2026-03-08T22:57:10.620 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509453 2026-03-08T22:57:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672974 2-64424509453' 2026-03-08T22:57:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:10.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:10.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561026 2026-03-08T22:57:10.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561026 2026-03-08T22:57:10.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672974 2-64424509453 3-188978561026' 2026-03-08T22:57:10.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:10.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836496 2026-03-08T22:57:10.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:10.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:10.704 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836496 2026-03-08T22:57:10.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:10.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836496 2026-03-08T22:57:10.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836496' 2026-03-08T22:57:10.705 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836496 2026-03-08T22:57:10.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:10.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836496 2026-03-08T22:57:10.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:10.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672974 2026-03-08T22:57:10.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:10.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:10.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672974 
2026-03-08T22:57:10.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:10.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672974 2026-03-08T22:57:10.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672974' 2026-03-08T22:57:10.877 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672974 2026-03-08T22:57:10.877 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:11.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672974 -lt 42949672974 2026-03-08T22:57:11.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:11.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509453 2026-03-08T22:57:11.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:11.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:11.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509453 2026-03-08T22:57:11.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d 
- -f 2 2026-03-08T22:57:11.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509453 2026-03-08T22:57:11.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509453' 2026-03-08T22:57:11.053 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509453 2026-03-08T22:57:11.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:11.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509452 -lt 64424509453 2026-03-08T22:57:11.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:12.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:12.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:12.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509452 -lt 64424509453 2026-03-08T22:57:12.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:13.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:57:13.390 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:13.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509453 -lt 64424509453 2026-03-08T22:57:13.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:13.564 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-188978561026 2026-03-08T22:57:13.564 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:13.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:13.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-188978561026 2026-03-08T22:57:13.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561026 2026-03-08T22:57:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 188978561026' 2026-03-08T22:57:13.568 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 188978561026 2026-03-08T22:57:13.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 
2026-03-08T22:57:13.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561026 -lt 188978561026 2026-03-08T22:57:13.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:13.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:13.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:13.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:13.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:13.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:13.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:13.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:13.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:13.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] 
| .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:13.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:14.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:14.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:14.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:14.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:14.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:14.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:14.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T22:57:14.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:57:14.336 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:261: corrupt_and_repair_erasure_coded: corrupt_and_repair_one td/osd-scrub-repair ecpool 1 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=ecpool 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=1 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:14.336 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:14.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:14.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:14.443 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:14.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:14.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:14.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:57:14.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:14.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:57:14.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove 2026-03-08T22:57:15.085 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 
2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:15.620 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:15.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:15.621 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:57:15.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:57:15.622 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T22:57:15.622 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:15.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:57:15.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:57:15.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:15.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:15.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:15.638 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:15.639+0000 7f38fc0e98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:15.639 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:15.639+0000 7f38fc0e98c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:57:15.641 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:15.639+0000 7f38fc0e98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:15.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:15.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:15.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:16.105 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:16.107+0000 7f38fc0e98c0 -1 Falling back to 
public interface 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:16.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:17.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:17.579 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:17.579+0000 7f38fc0e98c0 -1 osd.1 46 log_to_monitors true 2026-03-08T22:57:18.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:18.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:18.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:18.138 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:57:18.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:18.139 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:18.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:19.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:19.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:19.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:19.315 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:57:19.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:19.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:19.474 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 50 up_thru 50 down_at 47 last_clean_interval [10,46) [v2:127.0.0.1:6810/2619672807,v1:127.0.0.1:6811/2619672807] [v2:127.0.0.1:6812/2619672807,v1:127.0.0.1:6813/2619672807] exists,up 4b11297b-7757-4601-a282-873f7025f4c9 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:19.475 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:19.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:19.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:19.534 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:19.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:19.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:19.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:19.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:19.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:19.695 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:57:19.695 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:57:19.695 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:57:19.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:19.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:19.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:19.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836498 2026-03-08T22:57:19.771 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836498 2026-03-08T22:57:19.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498' 2026-03-08T22:57:19.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:19.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:19.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364802 2026-03-08T22:57:19.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364802 2026-03-08T22:57:19.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-214748364802' 2026-03-08T22:57:19.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:19.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:19.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509456 2026-03-08T22:57:19.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509456 2026-03-08T22:57:19.920 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-214748364802 2-64424509456' 2026-03-08T22:57:19.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:19.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:20.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561029 2026-03-08T22:57:20.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561029 2026-03-08T22:57:20.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-214748364802 2-64424509456 3-188978561029' 2026-03-08T22:57:20.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:20.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836498 2026-03-08T22:57:20.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:20.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:20.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836498 2026-03-08T22:57:20.005 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:20.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836498 2026-03-08T22:57:20.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836498' 2026-03-08T22:57:20.006 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836498 2026-03-08T22:57:20.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:20.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836497 -lt 21474836498 2026-03-08T22:57:20.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:21.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:21.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:21.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836499 -lt 21474836498 2026-03-08T22:57:21.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:21.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-214748364802 2026-03-08T22:57:21.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:21.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-214748364802 2026-03-08T22:57:21.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364802 2026-03-08T22:57:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 214748364802' 2026-03-08T22:57:21.324 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 214748364802 2026-03-08T22:57:21.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364802 -lt 214748364802 2026-03-08T22:57:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:21.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509456 2026-03-08T22:57:21.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:21.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:21.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509456 2026-03-08T22:57:21.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:21.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509456 2026-03-08T22:57:21.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509456' 2026-03-08T22:57:21.490 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509456 2026-03-08T22:57:21.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:21.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509456 -lt 64424509456 2026-03-08T22:57:21.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:21.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-188978561029 2026-03-08T22:57:21.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:21.655 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:21.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-188978561029 2026-03-08T22:57:21.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:21.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561029 2026-03-08T22:57:21.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 188978561029' 2026-03-08T22:57:21.656 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 188978561029 2026-03-08T22:57:21.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:21.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561029 -lt 188978561029 2026-03-08T22:57:21.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:21.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:21.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:22.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 
2026-03-08T22:57:22.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:22.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:22.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:22.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:22.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:22.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:22.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:22.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:22.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:22.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:22.178 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:22.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:22.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:22.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:22.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg ecpool SOMETHING 2026-03-08T22:57:22.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T22:57:22.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:57:22.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:57:22.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:57:22.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=2.0 2026-03-08T22:57:22.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 2.0 2026-03-08T22:57:22.546 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T22:57:22.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T22:57:22.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:22.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:22.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:22.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:22.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:56:58.834305+0000 2026-03-08T22:57:22.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T22:57:22.860 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T22:56:58.834305+0000 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T22:57:22.872 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:56:58.834305+0000 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:22.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:23.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:58.834305+0000 '>' 2026-03-08T22:56:58.834305+0000 2026-03-08T22:57:23.031 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:24.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:24.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:24.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:24.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:24.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:24.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:24.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:24.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:56:58.834305+0000 '>' 2026-03-08T22:56:58.834305+0000 2026-03-08T22:57:24.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T22:57:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:25.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:25.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:25.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:25.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:25.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:23.396187+0000 '>' 2026-03-08T22:56:58.834305+0000 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:57:25.350 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:25.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:25.351 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:25.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:25.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:25.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:25.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:57:25.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:25.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:25.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:57:25.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:25.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:57:25.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 
SOMETHING list-attrs 2026-03-08T22:57:25.787 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 2.0s0_head,0#-4:00000000:::scrub_2.0s0:head#, (61) No data available 2026-03-08T22:57:25.787 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T22:57:25.787 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key 2026-03-08T22:57:25.787 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:26.072 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:26.072 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:26.072 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:26.073 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:57:26.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:57:26.074 INFO:tasks.workunit.client.0.vm03.stderr:start osd.3 2026-03-08T22:57:26.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:26.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T22:57:26.075 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:57:26.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:26.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:26.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:26.090 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:26.091+0000 7f78431ca8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:26.090 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:26.091+0000 7f78431ca8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:26.092 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:26.091+0000 7f78431ca8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:26.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:26.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:27.037 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:27.039+0000 7f78431ca8c0 -1 Falling back to public interface 2026-03-08T22:57:27.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:57:27.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:27.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:27.407 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:57:27.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:27.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:27.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:28.052 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:57:28.055+0000 7f78431ca8c0 -1 osd.3 52 log_to_monitors true 2026-03-08T22:57:28.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:28.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:28.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:28.565 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:57:28.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:28.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:28.750 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:29.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:29.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:29.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:29.752 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:57:29.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:29.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 up in weight 1 up_from 56 up_thru 56 down_at 53 last_clean_interval [44,52) [v2:127.0.0.1:6826/3347390512,v1:127.0.0.1:6827/3347390512] [v2:127.0.0.1:6828/3347390512,v1:127.0.0.1:6829/3347390512] exists,up 2e2f6557-d446-402f-b97c-10a3115fde24 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:29.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:29.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:29.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T22:57:29.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:29.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:29.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:29.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:30.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:30.152 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:57:30.152 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:57:30.152 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:57:30.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:30.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:30.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:30.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836502 2026-03-08T22:57:30.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836502 2026-03-08T22:57:30.236 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502' 2026-03-08T22:57:30.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:30.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:30.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364805 2026-03-08T22:57:30.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364805 2026-03-08T22:57:30.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502 1-214748364805' 2026-03-08T22:57:30.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:30.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:30.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509459 2026-03-08T22:57:30.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509459 2026-03-08T22:57:30.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502 1-214748364805 2-64424509459' 2026-03-08T22:57:30.389 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:30.389 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:30.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168578 2026-03-08T22:57:30.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168578 2026-03-08T22:57:30.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836502 1-214748364805 2-64424509459 3-240518168578' 2026-03-08T22:57:30.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:30.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836502 2026-03-08T22:57:30.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:30.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:30.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836502 2026-03-08T22:57:30.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:30.471 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836502 2026-03-08T22:57:30.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836502' 2026-03-08T22:57:30.471 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836502 2026-03-08T22:57:30.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:30.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836500 -lt 21474836502 2026-03-08T22:57:30.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:31.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:31.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:31.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836502 -lt 21474836502 2026-03-08T22:57:31.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:31.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-214748364805 2026-03-08T22:57:31.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T22:57:31.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:31.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-214748364805 2026-03-08T22:57:31.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:31.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364805 2026-03-08T22:57:31.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 214748364805' 2026-03-08T22:57:31.792 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 214748364805 2026-03-08T22:57:31.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:31.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364805 -lt 214748364805 2026-03-08T22:57:31.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:31.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509459 2026-03-08T22:57:31.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:31.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T22:57:31.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509459 2026-03-08T22:57:31.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:31.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509459 2026-03-08T22:57:31.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509459' 2026-03-08T22:57:31.953 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509459 2026-03-08T22:57:31.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:32.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509459 -lt 64424509459 2026-03-08T22:57:32.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:32.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168578 2026-03-08T22:57:32.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:32.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:32.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 3-240518168578 2026-03-08T22:57:32.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:32.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168578 2026-03-08T22:57:32.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168578' 2026-03-08T22:57:32.118 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 240518168578 2026-03-08T22:57:32.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:32.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168578 -lt 240518168578 2026-03-08T22:57:32.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:32.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:32.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:32.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:32.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:32.491 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:32.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:32.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:32.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:32.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:32.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:32.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:32.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:32.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:32.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:32.848 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:32.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:32.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:32.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T22:57:32.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:262: corrupt_and_repair_erasure_coded: corrupt_and_repair_two td/osd-scrub-repair ecpool 1 2 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:185: corrupt_and_repair_two: local dir=td/osd-scrub-repair 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:186: corrupt_and_repair_two: local poolname=ecpool 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:187: corrupt_and_repair_two: local first=1 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:188: corrupt_and_repair_two: local second=2 2026-03-08T22:57:32.868 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:193: corrupt_and_repair_two: pids= 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:194: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 128766"' 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 128766' 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:195: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 128768"' 2026-03-08T22:57:32.868 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 128768' 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:196: corrupt_and_repair_two: wait_background pids 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 128766 128768' 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 128766 2026-03-08T22:57:32.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/128769: /' 2026-03-08T22:57:32.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/128771: /' 2026-03-08T22:57:32.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:32.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local 
dir=td/osd-scrub-repair 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:34.175 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:57:34.175 
INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 SOMETHING remove 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: remove 2#2:eb822e21:::SOMETHING:head# 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:34.176 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: 
get_asok_dir 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: 
ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:34.177 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:34.178 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:34.178 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:34.178 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:57:34.178 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:57:34.178 INFO:tasks.workunit.client.0.vm03.stderr:128771: start osd.2 2026-03-08T22:57:34.178 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 
2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:34.183 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: 
_objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:34.184 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' 
--osd-failsafe-full-ratio=.99' 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:34.198 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:34.199 
INFO:tasks.workunit.client.0.vm03.stderr:128769: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 
2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: start osd.1 2026-03-08T22:57:34.199 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 
--debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: 2026-03-08T22:57:34.223+0000 7fe00f29e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: 2026-03-08T22:57:34.243+0000 7fe00f29e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: 2026-03-08T22:57:34.247+0000 7fe00f29e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: 0 2026-03-08T22:57:38.024 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:38.025 
INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: 1 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: 2026-03-08T22:57:35.727+0000 7fe00f29e8c0 -1 Falling back to public interface 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: 2 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: 2026-03-08T22:57:36.795+0000 7fe00f29e8c0 -1 osd.1 58 log_to_monitors true 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: 2026-03-08T22:57:37.791+0000 7fe00624e640 -1 osd.1 58 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: 3 2026-03-08T22:57:38.025 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 2026-03-08T22:57:34.199+0000 7fef60f168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 2026-03-08T22:57:34.219+0000 7fef60f168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 2026-03-08T22:57:34.235+0000 7fef60f168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 0 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 2026-03-08T22:57:35.191+0000 7fef60f168c0 -1 Falling back to public interface 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 1 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:38.131 
INFO:tasks.workunit.client.0.vm03.stderr:128771: 2026-03-08T22:57:36.159+0000 7fef60f168c0 -1 osd.2 58 log_to_monitors true 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 2 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: 3 2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T22:57:38.131 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: osd.2 up in weight 1 up_from 63 up_thru 63 down_at 59 last_clean_interval [15,58) [v2:127.0.0.1:6810/997989780,v1:127.0.0.1:6811/997989780] [v2:127.0.0.1:6812/997989780,v1:127.0.0.1:6813/997989780] exists,up 1ee90050-97fb-4d10-895d-8d12b78ca851 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: 
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:38.550 
INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: 1 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: 2 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: 3' 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836504 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836504 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836504' 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=279172874242 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 279172874242 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836504 1-279172874242' 2026-03-08T22:57:38.550 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:38.623 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tellcephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: osd.1 up in weight 1 up_from 65 up_thru 65 down_at 59 last_clean_interval [50,58) [v2:127.0.0.1:6818/1875894856,v1:127.0.0.1:6819/1875894856] [v2:127.0.0.1:6820/1875894856,v1:127.0.0.1:6821/1875894856] exists,up 4b11297b-7757-4601-a282-873f7025f4c9 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: 
wait_for_clean: local cmd= 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: 1 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: 2 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: 3' 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836505 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836505 2026-03-08T22:57:38.624 
INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505' 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=279172874243 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 279172874243 2026-03-08T22:57:38.624 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-279172874243' 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone osd.2 flush_pg_stats 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=270582939650 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 270582939650 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836504 1-279172874242 2-270582939650' 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168581 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168581 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836504 1-279172874242 2-270582939650 3-240518168581' 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836504 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836504 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=21474836504 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836504' 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: waiting osd.0 seq 21474836504 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836504 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-279172874242 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-279172874242 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=279172874242 2026-03-08T22:57:39.101 
INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 279172874242' 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: waiting osd.1 seq 279172874242 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 279172874242 -lt 279172874242 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-270582939650 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:39.101 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:40.337 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=270582939651 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 270582939651 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-279172874243 2-270582939651' 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168582 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168582 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-279172874243 2-270582939651 3-240518168582' 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
0-21474836505 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836505 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836505 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836505' 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: waiting osd.0 seq 21474836505 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836505 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-279172874243 2026-03-08T22:57:40.338 
INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-279172874243 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=279172874243 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 279172874243' 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: waiting osd.1 seq 279172874243 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 279172874242 -lt 279172874243 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:40.338 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128769: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_p.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-270582939650 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=270582939650 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 270582939650' 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: waiting osd.2 seq 270582939650 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 270582939650 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 270582939650 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 270582939651 -lt 270582939650 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168581 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-240518168581 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168581 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168581' 2026-03-08T22:57:41.988 
INFO:tasks.workunit.client.0.vm03.stderr:128771: waiting osd.3 seq 240518168581 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168582 -lt 240518168581 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:41.988 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:41.988 
INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("cg_stats: test 279172874242 -lt 279172874243 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 279172874243 -lt 279172874243 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-270582939651 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-270582939651 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=270582939651 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 270582939651' 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: waiting osd.2 seq 270582939651 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 270582939651 -lt 270582939651 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:42.104 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168582 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-240518168582 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168582 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168582' 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: waiting osd.3 seq 240518168582 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168582 -lt 240518168582 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:42.105 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") lean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128771: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128771: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:128771: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:42.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T22:57:42.501 INFO:tasks.workunit.client.0.vm03.stderr:| not)' 2026-03-08T22:57:42.501 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:42.501 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:42.501 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:42.501 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:42.501 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:128769: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:128769: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:128769: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 128768 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:197: corrupt_and_repair_two: return_code=0 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:198: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: get_pg 
ecpool SOMETHING 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:57:42.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:57:42.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: local pg=2.0 2026-03-08T22:57:42.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:204: corrupt_and_repair_two: repair 2.0 2026-03-08T22:57:42.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T22:57:42.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T22:57:42.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:42.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:42.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:42.666 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:42.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:42.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T22:57:42.984 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:42.998 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:42.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:43.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:23.396187+0000 '>' 2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:43.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:44.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:44.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:44.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:44.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:44.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:44.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:44.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:44.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:23.396187+0000 '>' 2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:44.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:45.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:45.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:45.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:45.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:45.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:45.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:45.348 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:45.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:23.396187+0000 '>' 2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:45.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:46.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:46.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:46.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:46.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:46.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:46.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:46.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:46.678 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:23.396187+0000 '>' 2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:46.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:47.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:47.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:47.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:23.396187+0000 '>' 2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:47.843 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:57:48.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:57:48.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:57:48.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:57:48.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:57:48.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:57:48.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:57:48.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:43.902694+0000 '>' 2026-03-08T22:57:23.396187+0000 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:208: 
corrupt_and_repair_two: pids= 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:209: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 131691"' 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 131691' 2026-03-08T22:57:49.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:210: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 2 SOMETHING list-attrs 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 131692"' 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 131692' 2026-03-08T22:57:49.010 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:211: corrupt_and_repair_two: wait_background pids 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 131691 131692' 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 131691 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/131694: /' 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/131696: /' 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:57:49.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 2 SOMETHING list-attrs 2026-03-08T22:57:49.760 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:57:49.760 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: 
shift 2026-03-08T22:57:49.760 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:57:49.760 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool 
--data-path td/osd-scrub-repair/1 SOMETHING list-attrs 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: _ 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: hinfo_key 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: snapset 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131694: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helper131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:49.761 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 SOMETHING list-attrs 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 SOMETHING list-attrs 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:49.762 
INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 SOMETHING list-attrs 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: _ 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: hinfo_key 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: snapset 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 
--auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:49.762 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: 
local name= 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 
2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:49.764 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131694: start osd.1 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 
--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: 
2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:49.765 
INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: start osd.2 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T22:57:49.765 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 
--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 2026-03-08T22:57:49.795+0000 7f7989a228c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 2026-03-08T22:57:49.811+0000 7f7989a228c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 2026-03-08T22:57:49.819+0000 7f7989a228c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 0 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 2026-03-08T22:57:51.031+0000 7f7989a228c0 -1 Falling back to public interface 2026-03-08T22:57:53.725 
INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 1 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 2026-03-08T22:57:51.763+0000 7f7989a228c0 -1 osd.2 67 log_to_monitors true 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 2 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: 3 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:53.725 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131696: osd.2 up in weight 1 up_from 71 up_thru 63 down_at 68 last_clean_interval [65,67) [v2:127.0.0.1:6810/2683959364,v1:127.0.0.1:6811/2683959364] [v2:127.0.0.1:6812/2683959364,v1:127.0.0.1:6813/2683959364] exists,up 4b11297b-7757-4601-a282-873f7025f4c9 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 
2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 2026-03-08T22:57:49.803+0000 7efc611bb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 2026-03-08T22:57:49.819+0000 7efc611bb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 2026-03-08T22:57:49.827+0000 7efc611bb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 0 
2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 2026-03-08T22:57:50.771+0000 7efc611bb8c0 -1 Falling back to public interface 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 1 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 2026-03-08T22:57:51.779+0000 7efc611bb8c0 -1 osd.1 67 log_to_monitors true 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 2 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: 3 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:57:53.763 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:57:54.147 
INFO:tasks.workunit.client.0.vm03.stderr:131694: osd.1 up in weight 1 up_from 71 up_thru 71 down_at 68 last_clean_interval [63,67) [v2:127.0.0.1:6818/511919408,v1:127.0.0.1:6819/511919408] [v2:127.0.0.1:6820/511919408,v1:127.0.0.1:6821/511919408] exists,up 1ee90050-97fb-4d10-895d-8d12b78ca851 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: 
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:54.147 
INFO:tasks.workunit.client.0.vm03.stderr:131696: 1 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: 2 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: 3' 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836509 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836509 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509' 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678018 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: 
test -z 304942678018 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-304942678018' 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 
2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: 1 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: 2 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: 3' 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836510 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836510 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836510' 2026-03-08T22:57:54.201 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:54.201 
INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678019 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678019 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836510 1-304942678019' 2026-03-08T22:57:54.147 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678018 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678018 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-304942678018 2-304942678018' 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168586 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168586 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-304942678018 2-304942678018 3-240518168586' 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836509 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836509 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836509 2026-03-08T22:57:55.863 
INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836509' 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: waiting osd.0 seq 21474836509 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836508 -lt 21474836509 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836510 -lt 21474836509 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-304942678018 2026-03-08T22:57:55.863 
INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-304942678018 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678018 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 304942678018' 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: waiting osd.1 seq 304942678018 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:55.863 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678019 -lt 304942678018 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678019 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678019 2026-03-08T22:57:55.929 
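The surrounding xtrace records show the ceph-helpers.sh flush_pg_stats helper at work: it tells every OSD to flush its PG stats, records each reported seq as an `osd-seq` pair, then polls `ceph osd last-stat-seq` until the monitor has caught up with every OSD. Below is a minimal runnable sketch of that logic, reconstructed from the trace rather than copied from the helper source; the `ceph` shell function is a hypothetical stub standing in for the real CLI, with seq values taken from the log so the sketch runs without a cluster.

```shell
#!/usr/bin/env bash
# Sketch of flush_pg_stats as reconstructed from the xtrace above; NOT the
# verbatim qa/standalone/ceph-helpers.sh source. `ceph` is a stub.
ceph() {
    case "$1 $2" in
        "osd ls")            printf '%s\n' 0 1 ;;
        "tell osd.0")        echo 21474836510 ;;   # seq reported by the flush
        "tell osd.1")        echo 304942678019 ;;
        "osd last-stat-seq") [ "$3" = 0 ] && echo 21474836510 || echo 304942678019 ;;
    esac
}

flush_pg_stats() {
    local timeout=${1:-300} ids seqs osd seq s
    ids=$(ceph osd ls)
    # Ask every OSD to publish its stats; remember the seq each one reports.
    for osd in $ids; do
        seq=$(ceph tell osd.$osd flush_pg_stats)
        test -z "$seq" && continue
        seqs="$seqs ${osd}-${seq}"
    done
    # Poll until the monitor's last-stat-seq reaches the recorded seq per OSD.
    for s in $seqs; do
        osd=$(echo $s | cut -d - -f 1)
        seq=$(echo $s | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test $(ceph osd last-stat-seq $osd) -lt $seq; do
            sleep 1
            timeout=$((timeout - 1))
            test $timeout -eq 0 && return 1
        done
    done
}

flush_pg_stats
```

In the trace the poll occasionally loses the race (e.g. `test 21474836508 -lt 21474836509`) and sleeps once before the monitor catches up; with the stub values here the loop exits immediately.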
INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836510 1-304942678019 2-304942678019' 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168587 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168587 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836510 1-304942678019 2-304942678019 3-240518168587' 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836510 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836510 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836510 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836510' 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: waiting osd.0 seq 21474836510 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836508 -lt 21474836510 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836510 -lt 21474836510 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-304942678019 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-304942678019 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678019 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 304942678019' 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: waiting osd.1 seq 304942678019 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:57:55.929 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678019 -lt 304942678019 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131694: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-304942678018 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-304942678018 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678018 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 304942678018' 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: waiting osd.2 seq 304942678018 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678019 -lt 304942678018 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168586 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-240518168586 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168586 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168586' 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: waiting osd.3 seq 240518168586 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168587 -lt 240518168586 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:56.683 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:56.703 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:56.703 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-304942678019 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-304942678019 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678019 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 304942678019' 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: waiting osd.2 seq 304942678019 2026-03-08T22:57:56.704 
INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678019 -lt 304942678019 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168587 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-240518168587 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168587 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168587' 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: waiting osd.3 seq 240518168587 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168587 -lt 240518168587 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: 
expression+='select(contains("stale") | not)' 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:57:56.704 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:57:56.911 INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:57:56.911 INFO:tasks.workunit.client.0.vm03.stderr:131696: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:56.911 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:56.911 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:56.911 INFO:tasks.workunit.client.0.vm03.stderr:131696: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:56.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T22:57:56.918 
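The wait_for_clean trace above amounts to a polling loop: compare get_num_active_clean (a jq filter over `ceph --format json pg dump pgs` counting PG states that contain "active" and "clean" but not "stale") against get_num_pgs, retrying through the delay schedule that get_timeout_delays produced (0.1, 0.2, 0.4, ... 15, 4.5). The sketch below reconstructs that loop from the trace; both probes are hypothetical stubs so it runs without a cluster, and it is not the verbatim helper.

```shell
#!/usr/bin/env bash
# Sketch of the wait_for_clean polling loop as reconstructed from the xtrace.
# The real get_num_pgs / get_num_active_clean run `ceph --format json ...`
# through jq, as the trace shows; here they are stubs.
get_num_pgs()          { echo 5; }
get_num_active_clean() { echo 5; }

wait_for_clean() {
    local num_active_clean=-1 cur_active_clean
    # Backoff schedule seen in the trace (output of get_timeout_delays 90 .1).
    local -a delays=(0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5)
    local -i loop=0

    test "$(get_num_pgs)" = 0 && return 1   # no PGs: nothing can become clean

    while true; do
        cur_active_clean=$(get_num_active_clean)
        test "$cur_active_clean" = "$(get_num_pgs)" && break
        if test "$cur_active_clean" != "$num_active_clean"; then
            loop=0                           # progress: restart the backoff
            num_active_clean=$cur_active_clean
        elif (( loop >= ${#delays[*]} )); then
            return 1                         # schedule exhausted, no progress
        fi
        sleep ${delays[$loop]}
        loop+=1
    done
    return 0
}

wait_for_clean && echo "cluster clean"
```

In the trace both processes see `test 5 = 5` on the first probe and break immediately, which is the fast path through this loop.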
INFO:tasks.workunit.client.0.vm03.stderr:131694: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:131694: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 131692 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:212: 
corrupt_and_repair_two: return_code=0 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:213: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T22:57:56.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:215: corrupt_and_repair_two: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T22:57:56.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:216: corrupt_and_repair_two: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:263: corrupt_and_repair_erasure_coded: corrupt_and_repair_two td/osd-scrub-repair ecpool 3 1 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:185: corrupt_and_repair_two: local dir=td/osd-scrub-repair 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:186: corrupt_and_repair_two: local poolname=ecpool 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:187: corrupt_and_repair_two: local first=3 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:188: corrupt_and_repair_two: local second=1 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:193: corrupt_and_repair_two: pids= 2026-03-08T22:57:56.941 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:194: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 134006"' 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 134006' 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:195: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 134008"' 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 134008' 2026-03-08T22:57:56.941 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:196: corrupt_and_repair_two: wait_background pids 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 134006 134008' 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 134006 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/134009: /' 2026-03-08T22:57:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/134011: /' 2026-03-08T22:57:56.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T22:57:56.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 
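The `run_in_background` / `wait_background` pair traced here launches each `objectstore_tool` call as a background job, appends `$!` to a caller-named variable via `eval` (`pids+=" 134006"`), then waits on each pid and accumulates a combined return code before clearing the list. A minimal Python sketch of the same pattern, using a caller-supplied list of `subprocess.Popen` handles in place of the eval'd shell variable:

```python
import subprocess
import sys

def run_in_background(pids, *cmd):
    # The shell helper appends "$!" to a caller-named variable via eval;
    # here we simply append a Popen handle to a caller-supplied list.
    pids.append(subprocess.Popen(list(cmd)))

def wait_background(pids):
    # Wait for every tracked child, combining exit codes the way
    # wait_background accumulates return_code, then reset the list
    # (the shell version does `eval 'pids='` as seen in the trace).
    return_code = 0
    for p in pids:
        return_code |= p.wait()
    pids.clear()
    return return_code

pids = []
run_in_background(pids, sys.executable, "-c", "pass")
run_in_background(pids, sys.executable, "-c", "pass")
print(wait_background(pids))  # → 0
```

As in the log, the caller (`corrupt_and_repair_two`) only checks the single combined code afterwards (`'[' 0 -ne 0 ']'`), so one nonzero child is enough to fail the step.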
2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T22:57:58.439 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool 
--data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: remove 0#2:eb822e21:::SOMETHING:head# 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:58.440 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:58.441 
INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: 
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:58.441 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 
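Note that `get_asok_path` deliberately emits `'/tmp/ceph-asok.43024/$cluster-$name.asok'` with the dollar variables left literal: the path is passed single-quoted so the shell does not expand them, and the Ceph daemon substitutes its own `$cluster`/`$name` metavariables per process. A toy sketch of that substitution (the real expansion happens inside the daemon; the function name and behavior here are illustrative only):

```python
def expand_metavars(template, cluster, name):
    # Toy stand-in for Ceph's config metavariable expansion. The helper
    # quotes the path precisely so that $cluster and $name survive the
    # shell and reach the daemon literally, as seen in the trace.
    return template.replace("$cluster", cluster).replace("$name", name)

print(expand_metavars("/tmp/ceph-asok.43024/$cluster-$name.asok", "ceph", "osd.3"))
# → /tmp/ceph-asok.43024/ceph-osd.3.asok
```

The same trick is used for `--log-file` and `--pid-file` in the `activate_osd` argument list below, giving each daemon a distinct file without per-daemon shell logic.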
2026-03-08T22:57:58.442 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:58.442 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:58.442 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:58.442 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:57:58.442 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:57:58.442 INFO:tasks.workunit.client.0.vm03.stderr:134009: start osd.3 2026-03-08T22:57:58.442 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local 
dir=td/osd-scrub-repair 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:57:58.459 
INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:57:58.459 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:57:58.460 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:57:58.460 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:57:58.460 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:57:58.460 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:57:58.460 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: 
get_asok_dir 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:57:58.462 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: 
ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:57:58.463 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:57:58.463 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:57:58.463 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:57:58.463 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:57:58.463 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:57:58.463 INFO:tasks.workunit.client.0.vm03.stderr:134011: start osd.1 2026-03-08T22:57:58.463 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max134009: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: 2026-03-08T22:57:58.459+0000 7f47060e48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: 2026-03-08T22:57:58.463+0000 7f47060e48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: 2026-03-08T22:57:58.471+0000 7f47060e48c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:02.349 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: 0 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: 2026-03-08T22:57:59.431+0000 7f47060e48c0 -1 Falling back to public interface 2026-03-08T22:58:02.350 
INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: 1 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: 2026-03-08T22:58:00.395+0000 7f47060e48c0 -1 osd.3 73 log_to_monitors true 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: 2 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: 3 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:02.350 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134009: osd.3 up in weight 1 up_from 77 up_thru 77 down_at 74 last_-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: 2026-03-08T22:57:58.495+0000 7f52dfb308c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: 2026-03-08T22:57:58.511+0000 7f52dfb308c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: 2026-03-08T22:57:58.519+0000 7f52dfb308c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:02.435 INFO:tasks.workunit.client.0.vm03.stderr:134011: 0 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: 2026-03-08T22:57:59.699+0000 7f52dfb308c0 -1 Falling back to public interface 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: 1 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:02.436 
INFO:tasks.workunit.client.0.vm03.stderr:134011: 2026-03-08T22:58:00.411+0000 7f52dfb308c0 -1 osd.1 73 log_to_monitors true 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: 2 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: 3 2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 
2026-03-08T22:58:02.436 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134011: osd.1 up in weight 1 up_from 77 up_thru 77 down_at 74 last_clean_interval [56,73) [v2:127.0.0.1:6810/1995622450,v1:127.0.0.1:6811/1995622450] [v2:127.0.0.1:6812/1995622450,v1:127.0.0.1:6813/1995622450] exists,up 2e2f6557-d446-402f-b97c-10a3115fde24 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: 
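The xtrace above walks through `wait_for_osd` from qa/standalone/ceph-helpers.sh: it polls `ceph osd dump` up to 300 times, one second apart, until `grep 'osd.<id> up'` matches, then breaks with status 0. A minimal re-sketch of that polling pattern follows; the `ceph` function below is a self-contained stub standing in for the real CLI, not part of the helper:

```shell
# Stub for the real `ceph` CLI so the sketch runs standalone;
# swap in the actual binary when running against a cluster.
ceph() { echo "osd.3 up in weight 1 up_from 77 up_thru 77"; }

# Poll until `ceph osd dump` reports the OSD in the requested state,
# mirroring the wait_for_osd loop traced in ceph-helpers.sh.
wait_for_osd() {
    local state=$1
    local id=$2
    local status=1
    for ((i = 0; i < 300; i++)); do
        echo $i   # progress marker, as seen in the trace output
        if ceph osd dump | grep -q "osd.$id $state"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}

wait_for_osd up 3 && echo "osd.3 is up"
```

With the stub answering immediately, the loop matches on iteration 0; against a real cluster the `sleep 1` retries give the OSD up to five minutes to come up.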
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:58:02.758 
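`get_timeout_delays 90 .1` expands to the thirteen-entry backoff schedule captured in the trace: each delay doubles, is capped at 15 s, and the final entry is trimmed so the series sums to the requested 90 s timeout. The loop below is my own re-derivation of that schedule, not the helper's actual implementation:

```shell
# Rebuild the backoff schedule printed by the trace: doubling delays,
# capped at 15s, with the last entry trimmed to hit the 90s total.
delays=$(awk -v t=90 -v s=0.1 -v c=15 'BEGIN {
    sum = 0; out = ""; done = 0
    while (sum < t && !done) {
        d = (s < c) ? s : c           # cap each delay at c seconds
        if (sum + d >= t) {           # last round: trim to the total
            d = t - sum
            done = 1
        }
        out = out (out == "" ? "" : " ") sprintf("%g", d)
        sum += d
        s *= 2                        # exponential backoff
    }
    print out
}')
echo "$delays"
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

Note the series checks out: 0.1 through 12.8 sum to 25.5, the four capped 15 s entries bring it to 85.5, and the trimmed 4.5 closes the 90 s budget, matching the `delays=(...)` array in the trace exactly.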
INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: 1 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: 2 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: 3' 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:02.758 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836513 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836513 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513' 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=330712481794 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481794 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-330712481794' 2026-03-08T22:58:02.759 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph clean_interval [71,73) [v2:127.0.0.1:6826/3926095806,v1:127.0.0.1:6827/3926095806] [v2:127.0.0.1:6828/3926095806,v1:127.0.0.1:6829/3926095806] exists,up 4b11297b-7757-4601-a282-873f7025f4c9 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:58:02.880 
INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: 1 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: 2 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: 3' 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836514 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836514 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836514' 2026-03-08T22:58:02.880 
INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481795 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481795 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836514 1-330712481795' 2026-03-08T22:58:02.880 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678022 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678022 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-330712481794 2-304942678022' 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481794 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481794 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-330712481794 2-304942678022 3-330712481794' 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836513 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836513 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:05.497 
INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836513 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836513' 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: waiting osd.0 seq 21474836513 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836513 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836513 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: 
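`flush_pg_stats`, traced above, tells each OSD to flush its PG stats, records the returned sequence number as an `osd-seq` pair in `$seqs`, then splits each pair with `cut -d -` and polls `ceph osd last-stat-seq` (decrementing a 300 s budget) until the monitor has caught up. A condensed sketch of that wait loop; the `ceph` function is again a stub standing in for the real CLI:

```shell
# Stub: always report a caught-up sequence so the sketch terminates.
ceph() { echo 330712481799; }

# Wait until the mon-side last-stat-seq reaches each recorded flush seq,
# mirroring the seqs=' 0-21474836513 1-330712481794 ...' bookkeeping
# in ceph-helpers.sh.
wait_for_stat_seqs() {
    local seqs=$1
    local timeout=300
    for s in $seqs; do
        local osd=$(echo "$s" | cut -d - -f 1)
        local seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 1
            timeout=$((timeout - 1))
            [ "$timeout" -eq 0 ] && return 1
        done
    done
    return 0
}

wait_for_stat_seqs '0-21474836513 1-330712481794'
```

This is why the trace repeats `test ... -lt ...` / `sleep 1` / `'[' 300 -eq 0 ']'` triples: each round compares the monitor's sequence against the target, sleeps, and checks the remaining budget before retrying.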
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836514 -lt 21474836513 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-330712481794 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-330712481794 2026-03-08T22:58:05.497 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481794 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.tell osd.2 flush_pg_stats 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678023 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678023 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836514 1-330712481795 2-304942678023' 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481795 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481795 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836514 1-330712481795 2-304942678023 3-330712481795' 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836514 2026-03-08T22:58:05.585 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 
2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836514 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836514 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836514' 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: waiting osd.0 seq 21474836514 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836514 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836514 2026-03-08T22:58:05.586 
INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836514 -lt 21474836514 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-330712481795 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-330712481795 2026-03-08T22:58:05.586 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481795 2026-03-08T22:58:06.260 
INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 330712481794' 2026-03-08T22:58:06.260 INFO:tasks.workunit.client.0.vm03.stderr:134009: waiting osd.1 seq 330712481794 2026-03-08T22:58:06.260 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:58:06.260 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481795 -lt 330712481794 2026-03-08T22:58:06.260 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:06.260 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:06.260 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-304942678022 2026-03-08T22:58:06.260 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-304942678022 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678022 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 304942678022' 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: waiting osd.2 seq 304942678022 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678023 -lt 304942678022 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-330712481794 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-330712481794 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481794 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 330712481794' 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: waiting osd.3 seq 330712481794 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481795 -lt 330712481794 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:58:06.261 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:58:06.380 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:58:06.380 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 330712481795' 2026-03-08T22:58:06.380 INFO:tasks.workunit.client.0.vm03.stderr:134011: waiting osd.1 seq 330712481795 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481795 -lt 330712481795 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-304942678023 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-304942678023 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678023 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 304942678023' 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: waiting osd.2 seq 304942678023 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678023 -lt 304942678023 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-330712481795 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:06.381 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-330712481795 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481795 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 330712481795' 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: waiting osd.3 seq 330712481795 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481795 -lt 330712481795 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:58:06.382 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:58:06.650 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:06.651 
INFO:tasks.workunit.client.0.vm03.stderr:134009: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:134009: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:58:06.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 134008 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:134011: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:134011: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:134011: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:197: corrupt_and_repair_two: return_code=0 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:198: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T22:58:06.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: 
corrupt_and_repair_two: get_pg ecpool SOMETHING 2026-03-08T22:58:06.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T22:58:06.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:58:06.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:58:06.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:58:06.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: local pg=2.0 2026-03-08T22:58:06.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:204: corrupt_and_repair_two: repair 2.0 2026-03-08T22:58:06.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T22:58:06.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T22:58:06.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:58:06.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:58:06.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 
2026-03-08T22:58:06.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:58:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:57:43.902694+0000 2026-03-08T22:58:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T22:58:07.278 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T22:58:07.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T22:57:43.902694+0000 2026-03-08T22:58:07.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T22:58:07.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:57:43.902694+0000 2026-03-08T22:58:07.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:58:07.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:58:07.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:58:07.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:58:07.292 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:58:07.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:58:07.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:58:07.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:58:07.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:43.902694+0000 '>' 2026-03-08T22:57:43.902694+0000 2026-03-08T22:58:07.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:58:08.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:58:08.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:58:08.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:58:08.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:58:08.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:58:08.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:58:08.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:58:08.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:43.902694+0000 '>' 2026-03-08T22:57:43.902694+0000 2026-03-08T22:58:08.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:58:09.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:58:09.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:58:09.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:58:09.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:58:09.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:58:09.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:58:09.644 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:58:09.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:57:43.902694+0000 '>' 2026-03-08T22:57:43.902694+0000 2026-03-08T22:58:09.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:58:10.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:58:10.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:58:10.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:58:10.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:58:10.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:58:10.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:58:10.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:58:10.981 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:58:07.447850+0000 '>' 2026-03-08T22:57:43.902694+0000 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:208: corrupt_and_repair_two: pids= 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:209: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 136883"' 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 136883' 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:210: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:58:10.981 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 136884"' 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 136884' 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:211: corrupt_and_repair_two: wait_background pids 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 136883 136884' 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 136883 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/136886: /' 2026-03-08T22:58:10.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/136888: /' 2026-03-08T22:58:10.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:58:10.982 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 
2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING list-attrs 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: Error getting attr on : 2.0s0_head,0#-4:00000000:::scrub_2.0s0:head#, (61) No data available 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: _ 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: hinfo_key 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: snapset 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:58:11.672 
INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:58:11.672 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: 
activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:58:11.674 
INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:58:11.674 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:58:11.675 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:58:11.675 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:58:11.675 INFO:tasks.workunit.client.0.vm03.stderr:136886: start osd.3 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: 
activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local 
id=1 2026-03-08T22:58:11.919 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING list-attrs 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: _ 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: hinfo_key 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: snapset 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:58:11.920 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:11.925 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:58:11.926 
INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: 
start osd.1 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:58:11.926 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--loglen=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:58:15.395 
INFO:tasks.workunit.client.0.vm03.stderr:136886: 2026-03-08T22:58:11.692+0000 7fac613698c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: 2026-03-08T22:58:11.696+0000 7fac613698c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: 2026-03-08T22:58:11.696+0000 7fac613698c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:15.395 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: 0 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:15.396 
INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: 2026-03-08T22:58:12.668+0000 7fac613698c0 -1 Falling back to public interface 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: 1 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: 2026-03-08T22:58:13.672+0000 7fac613698c0 -1 osd.3 78 log_to_monitors true 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: 2
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: 3
2026-03-08T22:58:15.396 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helper-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: 2026-03-08T22:58:11.960+0000 7f85ac6ef8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: 2026-03-08T22:58:11.976+0000 7f85ac6ef8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: 2026-03-08T22:58:11.984+0000 7f85ac6ef8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: 0
2026-03-08T22:58:15.889 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: 1
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: 2026-03-08T22:58:13.444+0000 7f85ac6ef8c0 -1 Falling back to public interface
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: 2
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: 2026-03-08T22:58:14.876+0000 7f85ac6ef8c0 -1 osd.1 78 log_to_monitors true
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: 3
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T22:58:15.890 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136888: osd.1 up in weight 1 up_from 86 up_thru 77 down_as.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: osd.3 up in weight 1 up_from 83 up_thru 83 down_at 79 last_clean_interval [77,78) [v2:127.0.0.1:6810/402300859,v1:127.0.0.1:6811/402300859] [v2:127.0.0.1:6812/402300859,v1:127.0.0.1:6813/402300859] exists,up 2e2f6557-d446-402f-b97c-10a3115fde24
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: 1
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: 2
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: 3'
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836518
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836518
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836518'
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=369367187458
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 369367187458
2026-03-08T22:58:16.067 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836518 1-369367187458'
2026-03-08T22:58:16.338 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $it 79 last_clean_interval [77,78) [v2:127.0.0.1:6826/2382016516,v1:127.0.0.1:6827/2382016516] [v2:127.0.0.1:6828/2382016516,v1:127.0.0.1:6829/2382016516] exists,up 4b11297b-7757-4601-a282-873f7025f4c9
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: 1
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: 2
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: 3'
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836519
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836519
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519'
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=369367187459
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 369367187459
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-369367187459'
2026-03-08T22:58:16.339 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:17.791 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stads
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678026
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678026
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836518 1-369367187458 2-304942678026'
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=356482285570
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 356482285570
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836518 1-369367187458 2-304942678026 3-356482285570'
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836518
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836518
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836518
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836518'
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: waiting osd.0 seq 21474836518
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836516 -lt 21474836518
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836519 -lt 21474836518
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-369367187458
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-369367187458
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=369367187458
2026-03-08T22:58:17.792 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 369367187458'
2026-03-08T22:58:17.793 INFO:tasks.workunit.client.0.vm03.stderr:136886: waiting osd.1 seq 369367187458
2026-03-08T22:58:17.793 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:58:17.793 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 369367187459 -lt 369367187458
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136886: /hts: ceph tell osd.2 flush_pg_stats
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678027
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678027
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-369367187459 2-304942678027'
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=356482285571
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 356482285571
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-369367187459 2-304942678027 3-356482285571'
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836519
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836519
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836519
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836519'
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: waiting osd.0 seq 21474836519
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836516 -lt 21474836519
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836519 -lt 21474836519
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:18.060 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-369367187459
2026-03-08T22:58:18.061 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:58:18.061 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:18.061 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-369367187459
2026-03-08T22:58:18.061 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=369367187459
2026-03-08T22:58:18.061 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 369367187459'
2026-03-08T22:58:18.061 INFO:tasks.workunit.client.0.vm03.stderr:136888: waiting osd.1 seq 369367187459
2026-03-08T22:58:18.061 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:58:18.061 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 369367187459 -lt 369367187459
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in ome/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-304942678026
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-304942678026
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678026
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 304942678026'
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: waiting osd.2 seq 304942678026
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678027 -lt 304942678026
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-356482285570
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-356482285570
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=356482285570
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 356482285570'
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: waiting osd.3 seq 356482285570
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 356482285571 -lt 356482285570
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:58:18.591 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:58:18.812 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:58:18.812 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:58:18.812 INFO:tasks.workunit.client.0.vm03.stderr:136886: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:58:18.812
INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:58:18.812 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:18.812 INFO:tasks.workunit.client.0.vm03.stderr:136886: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:18.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T22:58:18.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:58:18.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 136884 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:$seqs 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-304942678027 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-304942678027 2026-03-08T22:58:18.848 
INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678027 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 304942678027' 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: waiting osd.2 seq 304942678027 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678027 -lt 304942678027 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-356482285571 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-356482285571 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=356482285571 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 356482285571' 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: waiting osd.3 seq 356482285571 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 356482285571 -lt 356482285571 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:58:18.848 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:58:18.849 INFO:tasks.workunit.client.0.vm03.stderr:136888: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:58:18.849 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:58:18.849 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:58:18.849 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:58:18.849 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:58:18.849 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:58:18.849 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:58:19.055 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:19.055 INFO:tasks.workunit.client.0.vm03.stderr:136888: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:58:19.056 
INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr:136888: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:212: corrupt_and_repair_two: return_code=0 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:213: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T22:58:19.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:215: corrupt_and_repair_two: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T22:58:19.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:216: corrupt_and_repair_two: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:58:19.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 
2026-03-08T22:58:19.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:58:19.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:58:19.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:58:19.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:58:19.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:58:19.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:58:19.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:58:19.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:58:19.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:58:19.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:58:19.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:58:19.209 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c 
%T . 2026-03-08T22:58:19.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:58:19.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:58:19.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:58:19.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:58:19.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:58:19.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:58:19.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:58:19.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:58:19.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:58:19.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:58:19.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:58:19.242 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:19.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:58:19.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:58:19.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:58:19.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:58:19.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T22:58:19.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T22:58:19.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T22:58:19.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T22:58:19.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T22:58:19.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T22:58:19.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:58:19.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T22:58:19.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T22:58:19.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T22:58:19.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T22:58:19.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T22:58:19.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T22:58:19.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T22:58:19.251 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:19.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:19.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T22:58:19.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T22:58:19.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T22:58:19.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T22:58:19.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T22:58:19.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:19.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:19.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_and_repair_jerasure_overwrites td/osd-scrub-repair 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:819: TEST_corrupt_and_repair_jerasure_overwrites: '[' true = true ']' 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:820: TEST_corrupt_and_repair_jerasure_overwrites: corrupt_and_repair_jerasure td/osd-scrub-repair true 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:798: corrupt_and_repair_jerasure: local dir=td/osd-scrub-repair 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:799: corrupt_and_repair_jerasure: local allow_overwrites=true 2026-03-08T22:58:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:800: corrupt_and_repair_jerasure: local poolname=ecpool 2026-03-08T22:58:19.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:802: corrupt_and_repair_jerasure: run_mon td/osd-scrub-repair a 2026-03-08T22:58:19.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local 
dir=td/osd-scrub-repair 2026-03-08T22:58:19.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T22:58:19.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T22:58:19.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T22:58:19.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T22:58:19.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T22:58:19.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T22:58:19.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:19.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:19.280 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:19.280 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:19.280 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:19.281 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:58:19.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T22:58:19.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T22:58:19.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T22:58:19.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:58:19.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:58:19.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T22:58:19.312 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:58:19.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T22:58:19.313 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:58:19.313 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:58:19.313 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:58:19.313 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:19.313 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:58:19.314 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T22:58:19.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:58:19.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T22:58:19.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T22:58:19.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T22:58:19.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T22:58:19.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T22:58:19.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T22:58:19.384 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T22:58:19.384 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T22:58:19.384 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T22:58:19.384 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T22:58:19.384 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:19.384 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:58:19.386 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T22:58:19.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T22:58:19.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T22:58:19.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:803: corrupt_and_repair_jerasure: run_mgr td/osd-scrub-repair x
2026-03-08T22:58:19.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T22:58:19.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T22:58:19.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T22:58:19.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T22:58:19.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T22:58:19.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T22:58:19.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T22:58:19.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:58:19.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:58:19.562 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:58:19.562 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:19.562 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:58:19.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:58:19.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:58:19.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T22:58:19.582 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: seq 0 3
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: for id in $(seq 0 3)
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:805: corrupt_and_repair_jerasure: run_osd td/osd-scrub-repair 0
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:58:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:58:19.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:58:19.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T22:58:19.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:58:19.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=55325780-72ce-430d-85b2-bfd3492bb358
2026-03-08T22:58:19.592 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 55325780-72ce-430d-85b2-bfd3492bb358
2026-03-08T22:58:19.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 55325780-72ce-430d-85b2-bfd3492bb358'
2026-03-08T22:58:19.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:58:19.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAL/61pXXs+JBAAUD6/o1yJgX5WRDy70SocQw==
2026-03-08T22:58:19.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAL/61pXXs+JBAAUD6/o1yJgX5WRDy70SocQw=="}'
2026-03-08T22:58:19.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 55325780-72ce-430d-85b2-bfd3492bb358 -i td/osd-scrub-repair/0/new.json
2026-03-08T22:58:19.705 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:58:19.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T22:58:19.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAL/61pXXs+JBAAUD6/o1yJgX5WRDy70SocQw== --osd-uuid 55325780-72ce-430d-85b2-bfd3492bb358
2026-03-08T22:58:19.733 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:19.736+0000 7faea17b48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:19.735 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:19.736+0000 7faea17b48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:19.736 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:19.740+0000 7faea17b48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:19.736 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:19.740+0000 7faea17b48c0 -1 bdev(0x55d3f3d26c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T22:58:19.736 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:19.740+0000 7faea17b48c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T22:58:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T22:58:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:58:21.997 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T22:58:21.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T22:58:21.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:58:22.107 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T22:58:22.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T22:58:22.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T22:58:22.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T22:58:22.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T22:58:22.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T22:58:22.164 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:22.160+0000 7f675cfca8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:22.165 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:22.168+0000 7f675cfca8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:22.175 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:22.172+0000 7f675cfca8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T22:58:22.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:22.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:58:22.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:22.908 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:22.912+0000 7f675cfca8c0 -1 Falling back to public interface
2026-03-08T22:58:23.374 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:58:23.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:23.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:23.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T22:58:23.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:23.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:58:23.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:23.872 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:23.876+0000 7f675cfca8c0 -1 osd.0 0 log_to_monitors true
2026-03-08T22:58:24.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:24.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:24.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T22:58:24.549 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T22:58:24.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:58:24.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:24.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T22:58:24.857 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:24.860+0000 7f6758783640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T22:58:25.724 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T22:58:25.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T22:58:25.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T22:58:25.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T22:58:25.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T22:58:25.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/4278099112,v1:127.0.0.1:6803/4278099112] [v2:127.0.0.1:6804/4278099112,v1:127.0.0.1:6805/4278099112] exists,up 55325780-72ce-430d-85b2-bfd3492bb358
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: for id in $(seq 0 3)
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:805: corrupt_and_repair_jerasure: run_osd td/osd-scrub-repair 1
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T22:58:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T22:58:25.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T22:58:25.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T22:58:25.904 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 5ef39c9d-4d00-4ccf-aa53-af311ceb2231
2026-03-08T22:58:25.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=5ef39c9d-4d00-4ccf-aa53-af311ceb2231
2026-03-08T22:58:25.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 5ef39c9d-4d00-4ccf-aa53-af311ceb2231'
2026-03-08T22:58:25.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T22:58:25.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAR/61p9ukBNxAASCJCFPkyRBYYWjbDBZaNJQ==
2026-03-08T22:58:25.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAR/61p9ukBNxAASCJCFPkyRBYYWjbDBZaNJQ=="}'
2026-03-08T22:58:25.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 5ef39c9d-4d00-4ccf-aa53-af311ceb2231 -i td/osd-scrub-repair/1/new.json
2026-03-08T22:58:26.078 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T22:58:26.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json
2026-03-08T22:58:26.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAR/61p9ukBNxAASCJCFPkyRBYYWjbDBZaNJQ== --osd-uuid 5ef39c9d-4d00-4ccf-aa53-af311ceb2231
2026-03-08T22:58:26.110 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:26.112+0000 7fc4ece6a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:26.112 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:26.116+0000 7fc4ece6a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:26.112 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:26.116+0000 7fc4ece6a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T22:58:26.113 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:26.116+0000 7fc4ece6a8c0 -1 bdev(0x55e68f0a7c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted
2026-03-08T22:58:26.113 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:26.116+0000 7fc4ece6a8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid
2026-03-08T22:58:28.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring
2026-03-08T22:58:28.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T22:58:28.612 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T22:58:28.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T22:58:28.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T22:58:28.817 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T22:58:28.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T22:58:28.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:28.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:58:28.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:58:28.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:58:28.832 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:28.836+0000 7f86fcb2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:28.833 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:28.836+0000 7f86fcb2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:28.835 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:28.836+0000 7f86fcb2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:28.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:29.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:29.548 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:29.552+0000 7f86fcb2b8c0 -1 Falling back to public interface 2026-03-08T22:58:30.160 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:58:30.160 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:30.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:30.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:30.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:30.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:30.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:30.515 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:30.520+0000 7f86fcb2b8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T22:58:31.340 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:58:31.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:31.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:31.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:31.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:31.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:31.572 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:32.574 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:58:32.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:32.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:32.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:58:32.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:32.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:58:32.746 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3928599447,v1:127.0.0.1:6811/3928599447] [v2:127.0.0.1:6812/3928599447,v1:127.0.0.1:6813/3928599447] exists,up 5ef39c9d-4d00-4ccf-aa53-af311ceb2231 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: for id in $(seq 0 
3) 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:805: corrupt_and_repair_jerasure: run_osd td/osd-scrub-repair 2 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:32.747 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:32.747 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:32.747 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:32.748 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:58:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:58:32.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:58:32.750 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 79fcfabc-4e85-41f1-b208-de561eec9f60 2026-03-08T22:58:32.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=79fcfabc-4e85-41f1-b208-de561eec9f60 2026-03-08T22:58:32.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 79fcfabc-4e85-41f1-b208-de561eec9f60' 2026-03-08T22:58:32.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:58:32.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAY/61pdFXYLRAAFIupCQLK6+veHbDQEbL2Tg== 2026-03-08T22:58:32.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAY/61pdFXYLRAAFIupCQLK6+veHbDQEbL2Tg=="}' 2026-03-08T22:58:32.765 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 79fcfabc-4e85-41f1-b208-de561eec9f60 -i td/osd-scrub-repair/2/new.json 2026-03-08T22:58:32.934 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:58:32.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T22:58:32.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAY/61pdFXYLRAAFIupCQLK6+veHbDQEbL2Tg== --osd-uuid 79fcfabc-4e85-41f1-b208-de561eec9f60 2026-03-08T22:58:32.965 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:32.968+0000 7f4ce8abe8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:32.967 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:32.972+0000 7f4ce8abe8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:32.967 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:32.972+0000 7f4ce8abe8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:32.968 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:32.972+0000 7f4ce8abe8c0 -1 bdev(0x55c4abc05c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T22:58:32.968 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:32.972+0000 7f4ce8abe8c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T22:58:35.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T22:58:35.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:58:35.245 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T22:58:35.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T22:58:35.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:58:35.456 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T22:58:35.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T22:58:35.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:35.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:58:35.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:58:35.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:58:35.475 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:35.476+0000 7f08790568c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:35.483 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:35.488+0000 7f08790568c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:35.485 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:35.488+0000 7f08790568c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:35.643 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:35.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:35.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:35.940 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:35.944+0000 7f08790568c0 -1 Falling back to public interface 2026-03-08T22:58:36.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T22:58:36.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:36.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:36.811 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:58:36.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:36.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:37.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:37.798 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:37.796+0000 7f08790568c0 -1 osd.2 0 log_to_monitors true 2026-03-08T22:58:38.013 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:58:38.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:38.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:38.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:58:38.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:38.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:38.192 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:39.193 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:58:39.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:39.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:39.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:58:39.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:39.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:39.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:40.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:40.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:40.410 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T22:58:40.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:58:40.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:40.410 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:58:40.571 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2673635604,v1:127.0.0.1:6819/2673635604] [v2:127.0.0.1:6820/2673635604,v1:127.0.0.1:6821/2673635604] exists,up 79fcfabc-4e85-41f1-b208-de561eec9f60 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:804: corrupt_and_repair_jerasure: for id in $(seq 0 3) 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:805: corrupt_and_repair_jerasure: run_osd td/osd-scrub-repair 3 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T22:58:40.572 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:58:40.572 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:40.572 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T22:58:40.573 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:58:40.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T22:58:40.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local 
uuid=15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T22:58:40.574 INFO:tasks.workunit.client.0.vm03.stdout:add osd3 15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T22:58:40.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 15f78b88-fe52-405a-a6d1-07f48fd457b8' 2026-03-08T22:58:40.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T22:58:40.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAg/61p0Z4yIxAAqnEj/TM0uGm7IdLOCmPvrQ== 2026-03-08T22:58:40.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAg/61p0Z4yIxAAqnEj/TM0uGm7IdLOCmPvrQ=="}' 2026-03-08T22:58:40.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 15f78b88-fe52-405a-a6d1-07f48fd457b8 -i td/osd-scrub-repair/3/new.json 2026-03-08T22:58:40.743 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:58:40.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/3/new.json 2026-03-08T22:58:40.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 
--debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAg/61p0Z4yIxAAqnEj/TM0uGm7IdLOCmPvrQ== --osd-uuid 15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T22:58:40.772 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:40.772+0000 7fd8cb9148c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:40.774 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:40.772+0000 7fd8cb9148c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:40.775 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:40.776+0000 7fd8cb9148c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:40.775 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:40.776+0000 7fd8cb9148c0 -1 bdev(0x55d089eabc00 td/osd-scrub-repair/3/block) open stat got: (1) Operation not permitted 2026-03-08T22:58:40.775 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:40.776+0000 7fd8cb9148c0 -1 bluestore(td/osd-scrub-repair/3) _read_fsid unparsable uuid 2026-03-08T22:58:43.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/3/keyring 2026-03-08T22:58:43.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T22:58:43.042 INFO:tasks.workunit.client.0.vm03.stdout:adding osd3 key to auth repository 2026-03-08T22:58:43.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T22:58:43.042 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T22:58:43.238 INFO:tasks.workunit.client.0.vm03.stdout:start osd.3 2026-03-08T22:58:43.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T22:58:43.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:43.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T22:58:43.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T22:58:43.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T22:58:43.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:43.260+0000 7fc1133748c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:43.275 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:43.276+0000 7fc1133748c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:43.277 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:43.276+0000 7fc1133748c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:43.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:43.562 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:44.251 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:44.252+0000 7fc1133748c0 -1 Falling back to public interface 2026-03-08T22:58:44.563 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T22:58:44.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:44.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:44.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:44.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:44.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:44.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:45.712 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:45.712+0000 7fc1133748c0 -1 osd.3 0 log_to_monitors true 2026-03-08T22:58:45.721 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T22:58:45.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:45.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:45.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T22:58:45.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:45.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:45.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:46.898 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T22:58:46.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:58:46.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:46.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:58:46.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:46.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:47.059 INFO:tasks.workunit.client.0.vm03.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3028970888,v1:127.0.0.1:6827/3028970888] [v2:127.0.0.1:6828/3028970888,v1:127.0.0.1:6829/3028970888] exists,up 15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T22:58:47.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:58:47.059 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:58:47.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:58:47.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:807: corrupt_and_repair_jerasure: create_rbd_pool 2026-03-08T22:58:47.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T22:58:47.207 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T22:58:47.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T22:58:47.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T22:58:47.421 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T22:58:47.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:58:48.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T22:58:48.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:808: corrupt_and_repair_jerasure: wait_for_clean 2026-03-08T22:58:48.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:58:48.724 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:58:48.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:58:48.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:58:48.725 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:58:48.725 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:58:48.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:58:48.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:58:48.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:58:48.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:58:48.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:58:48.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:58:48.780 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:58:48.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:58:48.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:58:48.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:58:48.939 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:58:48.939 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:58:48.939 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:58:48.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:58:48.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:48.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:58:49.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 2026-03-08T22:58:49.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T22:58:49.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T22:58:49.013 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:49.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:58:49.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672965 2026-03-08T22:58:49.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672965 2026-03-08T22:58:49.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965' 2026-03-08T22:58:49.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:49.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:58:49.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509443 2026-03-08T22:58:49.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509443 2026-03-08T22:58:49.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-64424509443' 2026-03-08T22:58:49.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:49.158 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:58:49.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345922 2026-03-08T22:58:49.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345922 2026-03-08T22:58:49.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-64424509443 3-85899345922' 2026-03-08T22:58:49.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:49.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T22:58:49.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:49.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:58:49.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T22:58:49.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:49.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486 2026-03-08T22:58:49.233 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 
21474836486 2026-03-08T22:58:49.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T22:58:49.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:49.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836486 2026-03-08T22:58:49.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:50.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:58:50.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:50.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836486 2026-03-08T22:58:50.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:50.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672965 2026-03-08T22:58:50.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:50.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:58:50.549 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672965 2026-03-08T22:58:50.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:50.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672965 2026-03-08T22:58:50.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672965' 2026-03-08T22:58:50.550 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672965 2026-03-08T22:58:50.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:58:50.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672965 2026-03-08T22:58:50.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:50.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509443 2026-03-08T22:58:50.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:50.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:58:50.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509443 
2026-03-08T22:58:50.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:50.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509443 2026-03-08T22:58:50.715 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509443 2026-03-08T22:58:50.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509443' 2026-03-08T22:58:50.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:58:50.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509444 -lt 64424509443 2026-03-08T22:58:50.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:50.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345922 2026-03-08T22:58:50.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:50.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:58:50.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345922 2026-03-08T22:58:50.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d 
- -f 2 2026-03-08T22:58:50.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345922 2026-03-08T22:58:50.881 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345922 2026-03-08T22:58:50.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345922' 2026-03-08T22:58:50.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:58:51.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345922 -lt 85899345922 2026-03-08T22:58:51.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:58:51.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:51.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:51.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T22:58:51.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:58:51.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:58:51.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: 
get_num_active_clean: local expression 2026-03-08T22:58:51.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:58:51.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:58:51.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:58:51.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:58:51.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T22:58:51.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:58:51.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:51.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T22:58:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:51.580 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:810: corrupt_and_repair_jerasure: create_ec_pool ecpool true k=2 m=2 2026-03-08T22:58:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T22:58:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T22:58:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=true 2026-03-08T22:58:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T22:58:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=2 2026-03-08T22:58:51.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T22:58:51.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T22:58:52.239 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T22:58:52.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T22:58:53.255 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' true = true ']' 2026-03-08T22:58:53.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2508: create_ec_pool: ceph osd pool set ecpool allow_ec_overwrites true 2026-03-08T22:58:53.447 INFO:tasks.workunit.client.0.vm03.stderr:set pool 2 allow_ec_overwrites to true 2026-03-08T22:58:53.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: true 2026-03-08T22:58:53.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:58:53.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:58:53.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:58:53.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:58:53.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:58:53.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:58:53.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:58:53.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:58:53.683 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:58:53.683 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:58:53.683 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:58:53.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:58:53.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T22:58:53.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:58:53.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836488 2026-03-08T22:58:53.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836488 2026-03-08T22:58:53.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488' 2026-03-08T22:58:53.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:53.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:58:53.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672967 2026-03-08T22:58:53.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672967 2026-03-08T22:58:53.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488 1-42949672967' 2026-03-08T22:58:53.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:53.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:58:53.914 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509445 2026-03-08T22:58:53.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509445 2026-03-08T22:58:53.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488 1-42949672967 2-64424509445' 2026-03-08T22:58:53.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:58:53.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:58:53.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345924 2026-03-08T22:58:53.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345924 2026-03-08T22:58:53.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488 1-42949672967 2-64424509445 3-85899345924' 2026-03-08T22:58:53.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:53.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836488 2026-03-08T22:58:53.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:53.988 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:58:53.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836488 2026-03-08T22:58:53.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:53.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836488 2026-03-08T22:58:53.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836488' 2026-03-08T22:58:53.989 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836488 2026-03-08T22:58:53.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:54.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836488 2026-03-08T22:58:54.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:58:55.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:58:55.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:58:55.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836488 -lt 
21474836488 2026-03-08T22:58:55.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:55.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672967 2026-03-08T22:58:55.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:55.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:58:55.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672967 2026-03-08T22:58:55.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:55.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672967 2026-03-08T22:58:55.307 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672967 2026-03-08T22:58:55.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672967' 2026-03-08T22:58:55.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:58:55.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672967 -lt 42949672967 2026-03-08T22:58:55.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T22:58:55.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509445 2026-03-08T22:58:55.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:55.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:58:55.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509445 2026-03-08T22:58:55.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:55.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509445 2026-03-08T22:58:55.468 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509445 2026-03-08T22:58:55.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509445' 2026-03-08T22:58:55.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:58:55.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509445 -lt 64424509445 2026-03-08T22:58:55.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:58:55.625 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345924 2026-03-08T22:58:55.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:58:55.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:58:55.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345924 2026-03-08T22:58:55.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:58:55.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345924 2026-03-08T22:58:55.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345924' 2026-03-08T22:58:55.627 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345924 2026-03-08T22:58:55.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:58:55.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345924 -lt 85899345924 2026-03-08T22:58:55.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:58:55.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 
2026-03-08T22:58:55.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:55.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:58:55.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:58:55.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:58:55.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:58:55.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:58:55.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:58:55.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:58:55.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:58:56.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:58:56.144 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:58:56.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:58:56.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:811: corrupt_and_repair_jerasure: corrupt_and_repair_erasure_coded td/osd-scrub-repair ecpool 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:248: corrupt_and_repair_erasure_coded: local dir=td/osd-scrub-repair 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:249: corrupt_and_repair_erasure_coded: local poolname=ecpool 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:251: corrupt_and_repair_erasure_coded: add_something 
td/osd-scrub-repair ecpool 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T22:58:56.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T22:58:56.540 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T22:58:56.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T22:58:56.748 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T22:58:56.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T22:58:56.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T22:58:56.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: 
rados --pool ecpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T22:58:56.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:253: corrupt_and_repair_erasure_coded: get_primary ecpool SOMETHING 2026-03-08T22:58:56.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool 2026-03-08T22:58:56.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T22:58:56.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:58:56.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T22:58:56.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:253: corrupt_and_repair_erasure_coded: local primary=3 2026-03-08T22:58:56.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: get_osds ecpool SOMETHING 2026-03-08T22:58:56.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: sed -e s/3// 2026-03-08T22:58:56.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T22:58:56.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T22:58:56.953 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:58:56.954 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T22:58:57.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T22:58:57.116 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:58:57.116 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:58:57.116 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T22:58:57.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 1 2 0 2026-03-08T22:58:57.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: osds=('1' '2' '0') 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: local -a osds 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:255: corrupt_and_repair_erasure_coded: local not_primary_first=1 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:256: corrupt_and_repair_erasure_coded: local not_primary_second=2 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:259: corrupt_and_repair_erasure_coded: corrupt_and_repair_one td/osd-scrub-repair ecpool 3 2026-03-08T22:58:57.117 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=ecpool 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=3 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:58:57.117 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:58:57.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:58:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:58:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T22:58:57.222 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:58:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:58:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:58:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:58:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:58:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T22:58:57.873 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#2:eb822e21:::SOMETHING:head# 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:58:58.406 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:58:58.406 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:58:58.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 
2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:start osd.3 2026-03-08T22:58:58.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 
--auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:58:58.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T22:58:58.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:58:58.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:58:58.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:58:58.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:58:58.424 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:58.420+0000 7f56ba21b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:58.426 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:58.428+0000 7f56ba21b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:58:58.428 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:58.428+0000 7f56ba21b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:58.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:58.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:58:59.371 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:58:59.372+0000 7f56ba21b8c0 -1 Falling back to public interface 2026-03-08T22:58:59.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:58:59.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:58:59.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:58:59.742 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:58:59.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:58:59.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:58:59.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:00.347 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:00.349+0000 7f56ba21b8c0 -1 osd.3 37 log_to_monitors true 2026-03-08T22:59:00.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:00.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:00.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:00.910 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:59:00.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:00.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:01.095 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:01.389 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:01.389+0000 7f56b11cb640 -1 osd.3 37 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T22:59:02.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:02.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:02.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:02.096 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:59:02.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:02.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:02.260 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 up in weight 1 up_from 41 up_thru 41 down_at 38 last_clean_interval [20,37) [v2:127.0.0.1:6826/1740458421,v1:127.0.0.1:6827/1740458421] [v2:127.0.0.1:6828/1740458421,v1:127.0.0.1:6829/1740458421] exists,up 15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:02.261 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:59:02.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:59:02.321 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:59:02.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:59:02.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:59:02.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:59:02.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:59:02.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:59:02.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:59:02.486 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:59:02.486 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:59:02.486 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:59:02.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:59:02.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:02.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:59:02.559 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836491 2026-03-08T22:59:02.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836491 2026-03-08T22:59:02.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491' 2026-03-08T22:59:02.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:02.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:59:02.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970 2026-03-08T22:59:02.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970 2026-03-08T22:59:02.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491 1-42949672970' 2026-03-08T22:59:02.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:02.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:59:02.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509448 2026-03-08T22:59:02.708 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509448 2026-03-08T22:59:02.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491 1-42949672970 2-64424509448' 2026-03-08T22:59:02.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:02.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:59:02.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=176093659138 2026-03-08T22:59:02.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 176093659138 2026-03-08T22:59:02.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491 1-42949672970 2-64424509448 3-176093659138' 2026-03-08T22:59:02.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:02.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836491 2026-03-08T22:59:02.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:02.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:59:02.786 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836491 2026-03-08T22:59:02.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836491 2026-03-08T22:59:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836491' 2026-03-08T22:59:02.787 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836491 2026-03-08T22:59:02.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:02.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836491 2026-03-08T22:59:02.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:59:03.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:59:03.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:04.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836491 2026-03-08T22:59:04.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T22:59:05.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T22:59:05.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:05.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836491 -lt 21474836491 2026-03-08T22:59:05.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:05.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672970 2026-03-08T22:59:05.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:05.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:59:05.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970 2026-03-08T22:59:05.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:05.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970 2026-03-08T22:59:05.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970' 2026-03-08T22:59:05.299 INFO:tasks.workunit.client.0.vm03.stderr:waiting 
osd.1 seq 42949672970 2026-03-08T22:59:05.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:59:05.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672970 2026-03-08T22:59:05.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:05.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509448 2026-03-08T22:59:05.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:05.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:59:05.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509448 2026-03-08T22:59:05.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:05.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509448 2026-03-08T22:59:05.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509448' 2026-03-08T22:59:05.490 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509448 2026-03-08T22:59:05.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:59:05.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509448 -lt 64424509448 2026-03-08T22:59:05.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:05.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-176093659138 2026-03-08T22:59:05.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:05.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:59:05.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-176093659138 2026-03-08T22:59:05.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:05.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=176093659138 2026-03-08T22:59:05.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 176093659138' 2026-03-08T22:59:05.672 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 176093659138 2026-03-08T22:59:05.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:59:05.850 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 176093659138 -lt 176093659138 2026-03-08T22:59:05.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:59:05.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:05.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:06.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:59:06.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:59:06.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:59:06.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:59:06.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:59:06.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:59:06.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
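The `flush_pg_stats` loop traced above (ceph-helpers.sh:2273-2277) splits each `<osd>-<seq>` pair and polls `ceph osd last-stat-seq` until the reported sequence catches up. A simplified, runnable reconstruction of that wait loop follows; the `ceph` function here is a mock standing in for the real CLI (it pretends every OSD has already caught up), not the actual helper code.

```shell
# Mock for the real `ceph` CLI so this sketch runs standalone:
# `ceph osd last-stat-seq <id>` -> a seq high enough that every OSD
# appears caught up and the wait loop exits immediately.
ceph() {
    case "$1 $2" in
        "osd last-stat-seq") echo 999999999999 ;;
    esac
}

# Simplified version of the flush_pg_stats wait loop seen in the xtrace.
# $1 is a space-separated list of "<osd>-<seq>" pairs, e.g. "2-64424509448".
wait_for_stat_seqs() {
    local seqs=$1
    local s osd seq
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        # Poll until the monitor has seen stats at least as new as $seq.
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 1
        done
    done
}

wait_for_stat_seqs "2-64424509448 3-176093659138"
# prints:
#   waiting osd.2 seq 64424509448
#   waiting osd.3 seq 176093659138
```

The real helper gets `$seqs` from `ceph tell osd.* flush_pg_stats`; the parsing and comparison above mirror lines 2274-2277 of the trace.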
2026-03-08T22:59:06.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:59:06.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:59:06.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:59:06.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:06.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:06.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:59:06.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:59:06.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:59:06.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg ecpool SOMETHING 2026-03-08T22:59:06.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T22:59:06.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local 
objectname=SOMETHING 2026-03-08T22:59:06.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:59:06.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:59:06.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=2.0 2026-03-08T22:59:06.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 2.0 2026-03-08T22:59:06.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T22:59:06.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T22:59:06.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:06.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:06.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:06.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:06.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: 
repair: local last_scrub=2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:06.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T22:59:06.946 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:06.959 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:06.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:07.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:58:52.243916+0000 '>' 2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:07.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:08.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:08.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:08.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:08.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:08.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:08.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:08.133 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:08.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:58:52.243916+0000 '>' 2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:08.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:09.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:09.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:09.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:09.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:09.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:09.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:09.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:09.472 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:58:52.243916+0000 '>' 2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:09.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:10.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:10.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:10.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:10.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:10.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:10.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:58:52.243916+0000 '>' 2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:10.645 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:11.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:11.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:11.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:11.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:11.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:11.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:11.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:11.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:58:52.243916+0000 '>' 2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:11.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:12.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T22:59:12.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:12.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:12.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:12.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:12.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:12.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:07.396604+0000 '>' 2026-03-08T22:58:52.243916+0000 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:59:13.001 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:13.001 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:13.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:59:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:59:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:59:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:59:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:59:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:59:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 
SOMETHING list-attrs 2026-03-08T22:59:13.435 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 2.0s0_head,0#-4:00000000:::scrub_2.0s0:head#, (61) No data available 2026-03-08T22:59:13.436 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T22:59:13.436 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key 2026-03-08T22:59:13.436 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T22:59:13.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T22:59:13.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:59:13.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:59:13.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:13.723 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:13.723 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:59:13.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:13.724 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:59:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:59:13.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:59:13.725 INFO:tasks.workunit.client.0.vm03.stderr:start osd.3 2026-03-08T22:59:13.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:59:13.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T22:59:13.726 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:59:13.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:59:13.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:59:13.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:59:13.742 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:13.741+0000 7f293fe598c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:13.751 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:13.753+0000 7f293fe598c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:13.753 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:13.753+0000 7f293fe598c0 -1 WARNING: all dangerous and experimental features are enabled. 
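After restarting the daemon, `activate_osd` blocks in `wait_for_osd` (ceph-helpers.sh:978-991), which polls `ceph osd dump` up to 300 times for an `osd.N up` line. The sketch below reconstructs that poll loop in runnable form; the `ceph` mock (backed by a temp-file counter so its state survives pipeline subshells) is hypothetical and reports the OSD up on the third poll.

```shell
# Mock `ceph osd dump` that reports osd.3 up starting with the third call.
# A temp file holds the call count because each pipeline invocation of the
# mock runs in a subshell and would otherwise lose its state.
STATE=$(mktemp)
echo 0 > "$STATE"
ceph() {
    local n=$(($(cat "$STATE") + 1))
    echo "$n" > "$STATE"
    if [ "$n" -ge 3 ]; then
        echo "osd.3 up in weight 1"
    else
        echo "osd.3 down in weight 1"
    fi
}

# Simplified version of the wait_for_osd loop seen in the xtrace.
wait_for_osd() {
    local state=$1 id=$2
    local status=1 i
    for ((i = 0; i < 300; i++)); do
        echo "$i"
        if ceph osd dump | grep "osd.$id $state"; then
            status=0
            break
        fi
        # the real helper sleeps 1s between polls; omitted here
    done
    return $status
}

wait_for_osd up 3
```

With this mock the loop prints the attempt counters `0`, `1`, `2`, then the matching `osd.3 up in weight 1` line and returns 0, matching the shape of the trace above.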
2026-03-08T22:59:13.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:13.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:14.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:14.959 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:14.961+0000 7f293fe598c0 -1 Falling back to public interface 2026-03-08T22:59:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:59:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:15.076 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:59:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:15.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:15.961 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:15.961+0000 7f293fe598c0 -1 osd.3 42 log_to_monitors true 2026-03-08T22:59:16.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:16.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:16.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:16.252 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:59:16.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:16.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:16.428 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:17.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:17.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:17.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:17.429 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:59:17.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:17.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 up in weight 1 up_from 46 up_thru 46 down_at 43 last_clean_interval [41,42) [v2:127.0.0.1:6826/1462374031,v1:127.0.0.1:6827/1462374031] [v2:127.0.0.1:6828/1462374031,v1:127.0.0.1:6829/1462374031] exists,up 15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
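The xtrace above steps through ceph-helpers.sh's `wait_for_osd` until `osd.3 up` appears in the osd dump. A minimal sketch of that polling loop, reconstructed from the trace (the real helper lives in qa/standalone/ceph-helpers.sh; exact error handling may differ):

```shell
# Sketch of wait_for_osd as seen in the trace: poll `ceph osd dump`
# until "osd.<id> <state>" appears, trying up to 300 times, 1s apart.
wait_for_osd() {
    local state=$1
    local id=$2
    local status=1
    for ((i = 0; i < 300; i++)); do
        echo $i
        if ceph osd dump | grep "osd.$id $state"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}
```

In the trace this succeeds on the fourth attempt (`echo 3`), once osd.3 reports `up_from 46`.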
2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:59:17.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:59:17.596 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:59:17.596 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:59:17.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:59:17.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:59:17.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:59:17.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:59:17.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T22:59:17.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:59:17.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:59:17.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:59:17.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:59:17.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:59:17.832 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:59:17.832 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:59:17.832 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:59:17.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:59:17.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:17.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:59:17.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836495 2026-03-08T22:59:17.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836495 2026-03-08T22:59:17.910 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495' 2026-03-08T22:59:17.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:17.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:59:17.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672974 2026-03-08T22:59:17.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672974 2026-03-08T22:59:17.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974' 2026-03-08T22:59:17.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:17.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:59:18.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509452 2026-03-08T22:59:18.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509452 2026-03-08T22:59:18.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509452' 2026-03-08T22:59:18.059 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:18.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:59:18.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=197568495618 2026-03-08T22:59:18.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 197568495618 2026-03-08T22:59:18.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509452 3-197568495618' 2026-03-08T22:59:18.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:18.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836495 2026-03-08T22:59:18.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:18.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:59:18.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836495 2026-03-08T22:59:18.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:18.135 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836495 2026-03-08T22:59:18.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836495' 2026-03-08T22:59:18.135 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836495 2026-03-08T22:59:18.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:18.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836495 -lt 21474836495 2026-03-08T22:59:18.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:18.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672974 2026-03-08T22:59:18.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:18.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:59:18.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672974 2026-03-08T22:59:18.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:18.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672974 
2026-03-08T22:59:18.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672974' 2026-03-08T22:59:18.314 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672974 2026-03-08T22:59:18.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:59:18.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672974 -lt 42949672974 2026-03-08T22:59:18.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:18.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509452 2026-03-08T22:59:18.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:18.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:59:18.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509452 2026-03-08T22:59:18.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:18.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509452 2026-03-08T22:59:18.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.2 seq 64424509452' 2026-03-08T22:59:18.482 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509452 2026-03-08T22:59:18.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:59:18.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509452 -lt 64424509452 2026-03-08T22:59:18.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:18.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-197568495618 2026-03-08T22:59:18.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:18.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:59:18.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-197568495618 2026-03-08T22:59:18.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:18.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=197568495618 2026-03-08T22:59:18.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 197568495618' 2026-03-08T22:59:18.659 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 197568495618 2026-03-08T22:59:18.660 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:59:18.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 197568495618 -lt 197568495618 2026-03-08T22:59:18.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:59:18.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:18.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:19.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:59:19.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:59:19.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:59:19.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:59:19.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:59:19.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:59:19.047 
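The `flush_pg_stats` trace above tells each OSD to flush its PG stats, records the returned sequence number per OSD, then waits for the monitor's `last-stat-seq` for each OSD to catch up. A simplified sketch of that flow (parameter expansion stands in for the helper's `cut -d - -f 1`/`-f 2` calls):

```shell
# Sketch of flush_pg_stats: flush each OSD's PG stats, then block until
# the cluster has observed each returned sequence number.
flush_pg_stats() {
    local ids seqs s osd seq
    ids=$(ceph osd ls)
    for osd in $ids; do
        seq=$(ceph tell "osd.$osd" flush_pg_stats)
        test -z "$seq" && continue
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=${s%-*}   # the helper uses `cut -d - -f 1`
        seq=${s#*-}   # and `cut -d - -f 2`
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 1
        done
    done
}
```

In this run every `test <seq> -lt <seq>` comparison is immediately false, so no waiting loop iterates: the flushes had already been applied by the time the check ran.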
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:59:19.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:59:19.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:59:19.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:59:19.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:19.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:19.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:59:19.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:59:19.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:59:19.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T22:59:19.436 
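`get_num_active_clean` counts PGs whose state contains both "active" and "clean" but not "stale", and `wait_for_clean` polls it against the total PG count until they match (here 5 == 5 on the first try). A sketch of that loop, with the delay schedule hard-coded to the values logged earlier in the trace:

```shell
# Helpers as traced: jq counts active+clean, non-stale PG states.
get_num_pgs() {
    ceph --format json status | jq .pgmap.num_pgs
}
get_num_active_clean() {
    ceph --format json pg dump pgs |
        jq '.pg_stats | [.[] | .state
            | select(contains("active") and contains("clean"))
            | select(contains("stale") | not)] | length'
}

# Polling loop: break once every PG is active+clean, or give up after
# the delay schedule (hard-coded here to the logged get_timeout_delays
# output for a 90s timeout) is exhausted.
wait_for_clean() {
    local -a delays=(0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5)
    local -i loop=0
    local cur_active_clean
    while true; do
        cur_active_clean=$(get_num_active_clean)
        test "$cur_active_clean" = "$(get_num_pgs)" && break
        (( loop >= ${#delays[*]} )) && return 1   # timed out
        sleep "${delays[$loop]}"
        (( loop++ ))
    done
    return 0
}
```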
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:261: corrupt_and_repair_erasure_coded: corrupt_and_repair_one td/osd-scrub-repair ecpool 1 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=ecpool 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=1 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:59:19.437 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:59:19.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:59:19.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:59:19.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:59:19.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:19.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:19.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:19.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:19.745 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:19.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:59:19.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:59:19.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:59:19.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:59:19.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:59:19.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:59:19.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove 2026-03-08T22:59:20.651 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T22:59:21.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 
2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:59:21.182 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:21.182 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:59:21.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:59:21.183 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:59:21.183 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T22:59:21.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:59:21.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:59:21.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:59:21.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:59:21.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:59:21.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:59:21.199 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:21.197+0000 
7f24f9b168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:21.202 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:21.205+0000 7f24f9b168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:21.204 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:21.205+0000 7f24f9b168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:59:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:59:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:21.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:21.363 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:59:21.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:21.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:21.529 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:22.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:22.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:22.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:22.531 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:59:22.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:22.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:22.643 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:22.645+0000 7f24f9b168c0 -1 Falling back to public interface 2026-03-08T22:59:22.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:23.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:23.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:23.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:23.704 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:59:23.704 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:23.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:23.865 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:23.865+0000 7f24f9b168c0 -1 osd.1 47 log_to_monitors true 2026-03-08T22:59:23.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:24.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:24.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:24.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:24.885 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:59:24.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:24.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:25.085 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 51 up_thru 51 down_at 48 last_clean_interval [10,47) [v2:127.0.0.1:6810/1730889708,v1:127.0.0.1:6811/1730889708] [v2:127.0.0.1:6812/1730889708,v1:127.0.0.1:6813/1730889708] exists,up 5ef39c9d-4d00-4ccf-aa53-af311ceb2231 2026-03-08T22:59:25.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:25.085 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:25.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:25.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:59:25.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:59:25.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:59:25.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:59:25.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:59:25.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:59:25.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:59:25.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:59:25.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:59:25.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
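`wait_for_osd`, traced above, is a bounded poll: up to 300 one-second attempts of `ceph osd dump | grep 'osd.1 up'`, breaking as soon as the grep matches. The same shape as a generic helper (the `wait_for` name and argument order are illustrative, not from ceph-helpers.sh):

```shell
# Generic bounded poll in the style of wait_for_osd: retry a command up
# to $tries times, one second apart; the exit status reports success/timeout.
wait_for() {
    local tries=$1; shift
    local i
    for ((i = 0; i < tries; i++)); do
        if "$@"; then
            return 0
        fi
        sleep 1
    done
    return 1
}
```

The traced loop is then roughly `wait_for 300 sh -c 'ceph osd dump | grep -q "osd.1 up"'`.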
get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:59:25.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:59:25.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:59:25.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:59:25.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:59:25.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:59:25.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:59:25.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:59:25.316 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:59:25.316 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:59:25.316 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:59:25.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:59:25.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:25.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 
flush_pg_stats 2026-03-08T22:59:25.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836498 2026-03-08T22:59:25.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836498 2026-03-08T22:59:25.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498' 2026-03-08T22:59:25.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:25.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:59:25.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332098 2026-03-08T22:59:25.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332098 2026-03-08T22:59:25.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-219043332098' 2026-03-08T22:59:25.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:25.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:59:25.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509455 2026-03-08T22:59:25.561 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509455 2026-03-08T22:59:25.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-219043332098 2-64424509455' 2026-03-08T22:59:25.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:25.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:59:25.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=197568495620 2026-03-08T22:59:25.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 197568495620 2026-03-08T22:59:25.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-219043332098 2-64424509455 3-197568495620' 2026-03-08T22:59:25.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:25.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836498 2026-03-08T22:59:25.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:25.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:59:25.648 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836498 2026-03-08T22:59:25.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:25.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836498 2026-03-08T22:59:25.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836498' 2026-03-08T22:59:25.649 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836498 2026-03-08T22:59:25.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:25.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836498 2026-03-08T22:59:25.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:59:26.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:59:26.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:27.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836498 -lt 21474836498 2026-03-08T22:59:27.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T22:59:27.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332098 2026-03-08T22:59:27.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:27.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:59:27.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332098 2026-03-08T22:59:27.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:27.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332098 2026-03-08T22:59:27.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332098' 2026-03-08T22:59:27.004 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332098 2026-03-08T22:59:27.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:59:27.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332098 -lt 219043332098 2026-03-08T22:59:27.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:27.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-64424509455 2026-03-08T22:59:27.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:27.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T22:59:27.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509455 2026-03-08T22:59:27.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:27.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509455 2026-03-08T22:59:27.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509455' 2026-03-08T22:59:27.198 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509455 2026-03-08T22:59:27.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:59:27.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509455 -lt 64424509455 2026-03-08T22:59:27.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:27.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-197568495620 2026-03-08T22:59:27.380 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:27.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:59:27.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-197568495620 2026-03-08T22:59:27.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:27.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=197568495620 2026-03-08T22:59:27.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 197568495620' 2026-03-08T22:59:27.383 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 197568495620 2026-03-08T22:59:27.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:59:27.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 197568495621 -lt 197568495620 2026-03-08T22:59:27.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:59:27.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:27.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 
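`flush_pg_stats`, traced above, records one `osd-seq` pair per OSD in a space-separated string, then splits each pair back apart with `cut` and polls `ceph osd last-stat-seq` until the reported sequence catches up. The bookkeeping half replays standalone (seq values copied from the trace; the `last-stat-seq` poll is omitted since it needs a live cluster):

```shell
# osd-seq pairs exactly as flush_pg_stats collected them in the trace above
seqs=' 0-21474836498 1-219043332098 2-64424509455 3-197568495620'
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # field 1: osd id
    seq=$(echo "$s" | cut -d - -f 2)   # field 2: flush sequence number
    echo "waiting osd.$osd seq $seq"
done
```

In the real helper each iteration then loops on `test $(ceph osd last-stat-seq $osd) -lt $seq`, sleeping a second between checks, as the trace shows for osd.0.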
2026-03-08T22:59:27.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:59:27.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:59:27.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:59:27.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:59:27.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:59:27.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:59:27.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:59:27.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:59:27.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:59:27.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:59:27.916 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:27.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:28.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:59:28.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:59:28.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:59:28.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg ecpool SOMETHING 2026-03-08T22:59:28.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T22:59:28.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T22:59:28.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T22:59:28.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T22:59:28.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=2.0 2026-03-08T22:59:28.281 
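`wait_for_clean` keeps polling until `get_num_active_clean` equals `get_num_pgs`; the jq filter above counts pg states containing both "active" and "clean" but not "stale". The same predicate as a pure-shell paraphrase of that jq expression (the function name and sample states are illustrative):

```shell
# Count pg states that are active+clean and not stale, mirroring the jq filter
# select(contains("active") and contains("clean")) | select(contains("stale") | not)
count_active_clean() {
    local n=0 s
    for s in "$@"; do
        case "$s" in
            *stale*) ;;                              # stale pgs never count
            *active*clean*|*clean*active*) n=$((n + 1)) ;;
        esac
    done
    echo "$n"
}
```

For example, `count_active_clean active+clean active+clean+scrubbing stale+active+clean undersized+degraded+peered` counts only the first two states.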
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 2.0 2026-03-08T22:59:28.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T22:59:28.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T22:59:28.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:28.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:28.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:28.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:28.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:59:07.396604+0000 2026-03-08T22:59:28.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T22:59:28.595 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T22:59:28.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T22:59:07.396604+0000 2026-03-08T22:59:28.608 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T22:59:28.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:59:07.396604+0000 2026-03-08T22:59:28.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T22:59:28.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T22:59:28.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:28.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:28.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:28.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:28.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:28.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:28.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: 
test 2026-03-08T22:59:07.396604+0000 '>' 2026-03-08T22:59:07.396604+0000 2026-03-08T22:59:28.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:29.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:29.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:29.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:29.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:29.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:29.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:29.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:29.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:07.396604+0000 '>' 2026-03-08T22:59:07.396604+0000 2026-03-08T22:59:29.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:30.952 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:30.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:30.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:30.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:30.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:30.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:30.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:31.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:07.396604+0000 '>' 2026-03-08T22:59:07.396604+0000 2026-03-08T22:59:31.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:32.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:32.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: 
(( i < 300 )) 2026-03-08T22:59:32.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:32.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:32.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:32.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:32.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:29.017754+0000 '>' 2026-03-08T22:59:07.396604+0000 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:59:32.294 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:59:32.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T22:59:32.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:59:32.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:32.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:32.295 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:32.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:32.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:32.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T22:59:32.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:59:32.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:59:32.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T22:59:32.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:59:32.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:59:32.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING list-attrs 2026-03-08T22:59:32.946 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T22:59:32.946 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key 
2026-03-08T22:59:32.946 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T22:59:32.946 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 2.0s0_head,0#-4:00000000:::scrub_2.0s0:head#, (61) No data available 2026-03-08T22:59:33.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:33.227 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:33.227 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:59:33.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 
2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:59:33.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T22:59:33.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T22:59:33.229 INFO:tasks.workunit.client.0.vm03.stderr:start osd.3 2026-03-08T22:59:33.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:59:33.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T22:59:33.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T22:59:33.230 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:59:33.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:59:33.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:59:33.249 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:33.249+0000 7f1082c9c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:33.251 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:33.253+0000 7f1082c9c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:33.258 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:33.257+0000 7f1082c9c8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:33.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:33.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:34.203 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:34.205+0000 7f1082c9c8c0 -1 Falling back to public interface 2026-03-08T22:59:34.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T22:59:34.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:34.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:34.647 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:59:34.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:34.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:34.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:35.195 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T22:59:35.197+0000 7f1082c9c8c0 -1 osd.3 52 log_to_monitors true 2026-03-08T22:59:35.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:35.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:35.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:35.814 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:59:35.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:35.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:35.985 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:36.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:36.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:36.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:36.986 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T22:59:36.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:36.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 up in weight 1 up_from 56 up_thru 56 down_at 53 last_clean_interval [46,52) [v2:127.0.0.1:6826/3542933226,v1:127.0.0.1:6827/3542933226] [v2:127.0.0.1:6828/3542933226,v1:127.0.0.1:6829/3542933226] exists,up 15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:59:37.160 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:59:37.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:59:37.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:59:37.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:59:37.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:59:37.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T22:59:37.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:59:37.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:59:37.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:59:37.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:59:37.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:59:37.399 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T22:59:37.399 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T22:59:37.399 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T22:59:37.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:59:37.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:37.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:59:37.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836501 2026-03-08T22:59:37.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836501 2026-03-08T22:59:37.484 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501' 2026-03-08T22:59:37.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:37.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:59:37.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332101 2026-03-08T22:59:37.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332101 2026-03-08T22:59:37.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-219043332101' 2026-03-08T22:59:37.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:37.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T22:59:37.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509458 2026-03-08T22:59:37.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509458 2026-03-08T22:59:37.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-219043332101 2-64424509458' 2026-03-08T22:59:37.650 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:37.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T22:59:37.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168578 2026-03-08T22:59:37.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168578 2026-03-08T22:59:37.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-219043332101 2-64424509458 3-240518168578' 2026-03-08T22:59:37.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:37.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836501 2026-03-08T22:59:37.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:37.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T22:59:37.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836501 2026-03-08T22:59:37.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:37.737 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836501 2026-03-08T22:59:37.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836501' 2026-03-08T22:59:37.737 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836501 2026-03-08T22:59:37.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:37.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836500 -lt 21474836501 2026-03-08T22:59:37.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T22:59:38.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T22:59:38.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T22:59:39.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836501 -lt 21474836501 2026-03-08T22:59:39.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:39.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332101 2026-03-08T22:59:39.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T22:59:39.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T22:59:39.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332101 2026-03-08T22:59:39.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:39.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332101 2026-03-08T22:59:39.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332101' 2026-03-08T22:59:39.082 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332101 2026-03-08T22:59:39.082 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T22:59:39.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332101 -lt 219043332101 2026-03-08T22:59:39.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:39.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509458 2026-03-08T22:59:39.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:39.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T22:59:39.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509458 2026-03-08T22:59:39.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:39.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509458 2026-03-08T22:59:39.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509458' 2026-03-08T22:59:39.261 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509458 2026-03-08T22:59:39.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T22:59:39.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509458 -lt 64424509458 2026-03-08T22:59:39.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T22:59:39.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168578 2026-03-08T22:59:39.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T22:59:39.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T22:59:39.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 3-240518168578 2026-03-08T22:59:39.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T22:59:39.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168578 2026-03-08T22:59:39.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168578' 2026-03-08T22:59:39.448 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 240518168578 2026-03-08T22:59:39.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T22:59:39.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168578 -lt 240518168578 2026-03-08T22:59:39.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T22:59:39.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:39.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:39.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T22:59:39.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T22:59:39.828 
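The trace above is the `flush_pg_stats` helper from ceph-helpers.sh: each entry in `$seqs` is an `osd-seq` pair, split with `cut`, and the helper polls `ceph osd last-stat-seq` until the OSD's reported sequence catches up with the target, sleeping between attempts and giving up after a timeout. A minimal sketch of that polling loop — the `ceph` function below is a stub standing in for the real CLI (it just advances a counter file), purely an assumption for illustration:

```shell
#!/bin/bash
# Stub for `ceph osd last-stat-seq <id>`: returns a counter that
# advances by one per call, simulating stats gradually flushing.
# (Illustrative stand-in -- NOT the real ceph CLI.)
ceph() {
    count=$(cat /tmp/stat_seq 2>/dev/null || echo 0)
    echo $((count + 1)) > /tmp/stat_seq
    echo "$count"
}

# Poll until each OSD's last-stat-seq reaches its target, mirroring
# flush_pg_stats: split "osd-seq", then loop with a bounded retry count.
wait_for_seqs() {
    seqs="$1"          # e.g. "0-3 1-5"
    timeout=300
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 0.1                  # the real helper sleeps 1s
            timeout=$((timeout - 1))
            [ "$timeout" -eq 0 ] && return 1
        done
    done
    return 0
}

rm -f /tmp/stat_seq
wait_for_seqs "0-3 1-5" && echo "stats flushed"
```

The huge sequence numbers in the log (e.g. 21474836501) come from the real OSDs; the stub only reproduces the loop's shape.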
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T22:59:39.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T22:59:39.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T22:59:39.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T22:59:39.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T22:59:39.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T22:59:40.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T22:59:40.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T22:59:40.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T22:59:40.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T22:59:40.206 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T22:59:40.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T22:59:40.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T22:59:40.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T22:59:40.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:262: corrupt_and_repair_erasure_coded: corrupt_and_repair_two td/osd-scrub-repair ecpool 1 2 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:185: corrupt_and_repair_two: local dir=td/osd-scrub-repair 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:186: corrupt_and_repair_two: local poolname=ecpool 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:187: corrupt_and_repair_two: local first=1 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:188: corrupt_and_repair_two: local second=2 2026-03-08T22:59:40.233 
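`get_num_active_clean` in the trace above builds a jq filter over `ceph --format json pg dump pgs`: keep PG states containing both "active" and "clean", drop any containing "stale", and count the remainder. The same filter can be exercised against a hand-written `pg_stats` fragment (the sample JSON below is fabricated minimal data, shaped only after the fields the helper reads):

```shell
#!/bin/bash
# Sample fragment shaped like `ceph --format json pg dump pgs` output.
cat > /tmp/pgs.json <<'EOF'
{"pg_stats": [
  {"pgid": "1.0", "state": "active+clean"},
  {"pgid": "1.1", "state": "active+clean+scrubbing"},
  {"pgid": "1.2", "state": "stale+active+clean"},
  {"pgid": "1.3", "state": "active+recovering"}
]}
EOF

# The exact filter from get_num_active_clean: 1.0 and 1.1 pass,
# 1.2 is excluded as stale, 1.3 lacks "clean".
jq '.pg_stats | [.[] | .state
     | select(contains("active") and contains("clean"))
     | select(contains("stale") | not)] | length' /tmp/pgs.json
# -> 2
```

`wait_for_clean` then simply compares this count against `get_num_pgs` (`.pgmap.num_pgs` from `ceph status`) and breaks out of its retry loop when they match, as the `test 5 = 5` / `break` lines above show.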
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:193: corrupt_and_repair_two: pids= 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:194: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 151397"' 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 151397' 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:195: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 151399"' 2026-03-08T22:59:40.233 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 151399' 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:196: corrupt_and_repair_two: wait_background pids 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 151397 151399' 2026-03-08T22:59:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T22:59:40.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:59:40.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 151397 2026-03-08T22:59:40.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/151400: /' 2026-03-08T22:59:40.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/151402: /' 2026-03-08T22:59:40.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:59:40.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local 
dir=td/osd-scrub-repair 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:59:41.755 
INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:59:41.755 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: 
get_asok_dir 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: 
ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:41.757 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:59:41.758 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:59:41.758 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:59:41.758 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:59:41.758 INFO:tasks.workunit.client.0.vm03.stderr:151400: start osd.1 2026-03-08T22:59:41.758 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: 
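The sequence traced for pid 151400 is the `_objectstore_tool_nowait` pattern: quiesce the OSD with `kill_daemons`, run `ceph-objectstore-tool` against its now-offline data path, then bring the daemon back with `activate_osd`. The shape of that flow can be sketched with stubbed steps — all three stub functions below are echo-only stand-ins for the real commands, named for illustration only:

```shell
#!/bin/bash
# Echo-only stand-ins for the real helpers (assumptions, shape only).
kill_daemons()        { echo "stop $3 (signal $2)"; }   # dir sig daemon
run_objectstore_tool() { echo "ceph-objectstore-tool --data-path $1 $2 remove"; }
activate_osd()        { echo "start osd.$2"; }          # dir id

# The stop -> offline surgery -> restart flow from the trace.  The OSD
# must be down while ceph-objectstore-tool touches its object store.
objectstore_tool_remove() {
    local dir=$1 id=$2 obj=$3
    kill_daemons "$dir" TERM "osd.$id" || return 1
    run_objectstore_tool "$dir/$id" "$obj" || return 1
    activate_osd "$dir" "$id"
}

objectstore_tool_remove td/osd-scrub-repair 1 SOMETHING
```

In the real helper the restart goes through the full `activate_osd` argument assembly shown above, so the revived OSD rejoins the cluster with the same fsid, mon address, and data/journal paths it had before the surgery.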
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T22:59:41.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 
2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: 
_objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 SOMETHING remove 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: remove 2#2:eb822e21:::SOMETHING:head# 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:59:41.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' 
--osd-failsafe-full-ratio=.99' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:41.781 
INFO:tasks.workunit.client.0.vm03.stderr:151402: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 
2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:41.781 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:59:41.782 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:59:41.782 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:59:41.782 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:59:41.782 INFO:tasks.workunit.client.0.vm03.stderr:151402: start osd.2 2026-03-08T22:59:41.782 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 
--debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: 2026-03-08T22:59:41.781+0000 7f6efeea48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: 2026-03-08T22:59:41.789+0000 7f6efeea48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: 2026-03-08T22:59:41.801+0000 7f6efeea48c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: 0 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: 2026-03-08T22:59:43.009+0000 7f6efeea48c0 -1 Falling back to public interface 2026-03-08T22:59:45.727 
INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: 1 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:45.727 INFO:tasks.workunit.client.0.vm03.stderr:151400: 2026-03-08T22:59:43.765+0000 7f6efeea48c0 -1 osd.1 58 log_to_monitors true 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: 2 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: 3 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:45.728 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: 2026-03-08T22:59:41.817+0000 7f5acb6058c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: 2026-03-08T22:59:41.833+0000 7f5acb6058c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: 2026-03-08T22:59:41.841+0000 7f5acb6058c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 0 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: 0 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:45.767 INFO:tasks.workunit.client.0.vm03.stderr:151402: 2026-03-08T22:59:43.025+0000 7f5acb6058c0 -1 Falling back to public interface 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: 1 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: 2026-03-08T22:59:44.109+0000 7f5acb6058c0 -1 osd.2 58 
log_to_monitors true 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: 2 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: 3 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:45.768 INFO:tasks.workunit.client.0.vm03.stderr:151402: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:47.270 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: 4 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: osd.1 up in weight 1 up_from 63 up_thru 65 down_at 59 last_clean_interval [51,58) [v2:127.0.0.1:6810/4083218809,v1:127.0.0.1:6811/4083218809] [v2:127.0.0.1:6812/4083218809,v1:127.0.0.1:6813/4083218809] exists,up 5ef39c9d-4d00-4ccf-aa53-af311ceb2231 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: 
wait_for_osd: return 0 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: 
wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: 1 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: 2 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: 3' 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T22:59:47.271 INFO:tasks.workunit.client.0.vm03.stderr:151400: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836504 2026-03-08T22:59:47.272 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836504 2026-03-08T22:59:47.272 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836504' 2026-03-08T22:59:47.272 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: 4 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: osd.2 up in weight 1 up_from 65 up_thru 48 down_at 59 last_clean_interval [15,58) [v2:127.0.0.1:6818/4027731905,v1:127.0.0.1:6819/4027731905] 
[v2:127.0.0.1:6820/4027731905,v1:127.0.0.1:6821/4027731905] exists,up 79fcfabc-4e85-41f1-b208-de561eec9f60 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T22:59:47.329 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: 1 2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: 2 2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: 3' 
2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836505
2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836505
2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505'
2026-03-08T22:59:47.330 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=270582939650
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 270582939650
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836504 1-270582939650'
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=279172874242
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 279172874242
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836504 1-270582939650 2-279172874242'
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168581
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168581
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836504 1-270582939650 2-279172874242 3-240518168581'
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836504
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836504
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836504
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836504'
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: waiting osd.0 seq 21474836504
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836503 -lt 21474836504
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836504
2026-03-08T22:59:48.911 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:59:48.912 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:59:48.912 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-270582939650
2026-03-08T22:59:48.912 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:59:48.912 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:59:48.912 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-270582939650
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/upers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=270582939651
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 270582939651
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-270582939651'
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=279172874243
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 279172874243
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-270582939651 2-279172874243'
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168582
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168582
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-270582939651 2-279172874243 3-240518168582'
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836505
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836505
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836505
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836505'
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: waiting osd.0 seq 21474836505
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836503 -lt 21474836505
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836505
2026-03-08T22:59:48.993 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:59:48.994 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:59:48.994 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-270582939651
2026-03-08T22:59:48.994 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T22:59:48.994 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:59:48.994 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-270582939651
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=270582939650
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 270582939650'
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: waiting osd.1 seq 270582939650
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 270582939651 -lt 270582939650
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-279172874242
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-279172874242
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=279172874242
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 279172874242'
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: waiting osd.2 seq 279172874242
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 279172874243 -lt 279172874242
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:59:49.724 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168581
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-240518168581
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168581
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168581'
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: waiting osd.3 seq 240518168581
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168582 -lt 240518168581
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:59:49.725 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("buntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=270582939651
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 270582939651'
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: waiting osd.1 seq 270582939651
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 270582939651 -lt 270582939651
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-279172874243
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-279172874243
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=279172874243
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 279172874243'
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: waiting osd.2 seq 279172874243
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 279172874243 -lt 279172874243
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168582
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-240518168582
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168582
2026-03-08T22:59:49.761 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168582'
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: waiting osd.3 seq 240518168582
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168582 -lt 240518168582
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T22:59:49.762 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:151400: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids
2026-03-08T22:59:50.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 151399
2026-03-08T22:59:50.177 INFO:tasks.workunit.client.0.vm03.stderr:stale") | not)'
2026-03-08T22:59:50.177 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T22:59:50.177 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T22:59:50.177 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T22:59:50.177 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T22:59:50.177 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:151402: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:151402: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\'''
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids=
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:197: corrupt_and_repair_two: return_code=0
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:198: corrupt_and_repair_two: '[' 0 -ne 0 ']'
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: get_pg ecpool SOMETHING
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING
2026-03-08T22:59:50.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T22:59:50.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: local pg=2.0
2026-03-08T22:59:50.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:204: corrupt_and_repair_two: repair 2.0
2026-03-08T22:59:50.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0
2026-03-08T22:59:50.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0
2026-03-08T22:59:50.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:59:50.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:59:50.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:59:50.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:59:50.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:59:29.017754+0000
2026-03-08T22:59:50.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0
2026-03-08T22:59:50.695 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T22:59:29.017754+0000
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:59:29.017754+0000
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:59:50.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:59:50.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:29.017754+0000 '>' 2026-03-08T22:59:29.017754+0000
2026-03-08T22:59:50.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:59:51.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:59:51.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:59:51.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:59:51.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:59:51.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:59:51.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:59:51.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:59:52.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:29.017754+0000 '>' 2026-03-08T22:59:29.017754+0000
2026-03-08T22:59:52.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:59:53.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:59:53.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:59:53.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:59:53.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T22:59:53.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T22:59:53.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T22:59:53.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T22:59:53.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:29.017754+0000 '>' 2026-03-08T22:59:29.017754+0000
2026-03-08T22:59:53.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T22:59:54.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T22:59:54.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T22:59:54.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T22:59:54.237
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:54.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:54.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:54.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:54.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:29.017754+0000 '>' 2026-03-08T22:59:29.017754+0000 2026-03-08T22:59:54.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:55.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:55.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:55.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:55.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:55.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:55.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:55.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:55.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:29.017754+0000 '>' 2026-03-08T22:59:29.017754+0000 2026-03-08T22:59:55.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T22:59:56.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T22:59:56.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T22:59:56.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T22:59:56.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T22:59:56.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T22:59:56.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T22:59:56.593 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T22:59:56.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:51.114908+0000 '>' 2026-03-08T22:59:29.017754+0000 2026-03-08T22:59:56.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T22:59:56.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:208: corrupt_and_repair_two: pids= 2026-03-08T22:59:56.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:209: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:59:56.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:59:56.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 154326"' 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 154326' 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:210: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 2 SOMETHING list-attrs 
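The `wait_for_scrub` loop traced above polls `last_scrub_stamp` once a second until the stamp advances past the value captured before `ceph pg repair` was issued. A minimal sketch of that pattern, modeled on the `ceph-helpers.sh` lines in the trace (`ceph --format json pg dump pgs` needs a live cluster, so this is illustrative rather than standalone):

```shell
# Sketch of the polling pattern from qa/standalone/ceph-helpers.sh as traced
# above; timeout and stamp name mirror the trace (300 iterations, 1 s apart).

get_last_scrub_stamp() {
    local pgid=$1
    local sname=${2:-last_scrub_stamp}
    ceph --format json pg dump pgs | \
        jq -r ".pg_stats | .[] | select(.pgid==\"$pgid\") | .$sname"
}

wait_for_scrub() {
    local pgid=$1
    local last_scrub=$2
    local i
    for ((i = 0; i < 300; i++)); do
        # lexicographic '>' is safe here: the stamps are fixed-width ISO 8601
        if test "$(get_last_scrub_stamp "$pgid")" '>' "$last_scrub"; then
            return 0
        fi
        sleep 1
    done
    return 1
}
```

The string `test ... '>' ...` comparison is exactly what the trace shows at line 2077; it only works because the timestamps share a fixed-width, zero-padded format.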
2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 154327"' 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 154327' 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:211: corrupt_and_repair_two: wait_background pids 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 154326 154327' 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 154326 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/154329: /' 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/154331: /' 2026-03-08T22:59:56.771 
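The `run_in_background`/`wait_background` pair traced here collects child pids into a caller-named variable and then waits on each one. A simplified sketch of that pattern (the real helper in `qa/standalone/ceph-helpers.sh` also pipes each child's output through `sed 's/^/<pid>: /'`, as the `154329:`/`154331:` prefixes in the trace show; that prefixing is omitted here):

```shell
# Simplified pid-list pattern: the caller passes the *name* of its pid
# variable, so eval is used to append to / read from it indirectly.

run_in_background() {
    local pid_variable=$1
    shift
    "$@" &
    # append the child's pid to the caller's list, matching the traced
    # eval 'pids+=" <pid>"' step
    eval "$pid_variable+=\" $!\""
}

wait_background() {
    local pids
    eval "pids=\${$1}"
    local return_code=0
    local pid
    for pid in $pids; do
        # remember any failure but keep reaping the remaining children
        wait "$pid" || return_code=1
    done
    return $return_code
}
```

This is why the trace shows `wait_background pids` rather than a plain `wait`: collecting pids per call lets the test wait on exactly the two `objectstore_tool` children it just launched, and propagate a non-zero status if either fails.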
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:59:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 2 SOMETHING list-attrs 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:59:57.503 
INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING list-attrs 2026-03-08T22:59:57.503 INFO:tasks.workunit.client.0.vm03.stderr:154329: _ 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: hinfo_key 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: snapset 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:59:57.504 
INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:59:57.504 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: 
activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:59:57.506 
INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T22:59:57.506 INFO:tasks.workunit.client.0.vm03.stderr:154329: start osd.1 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: 
activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_rec154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 SOMETHING list-attrs 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 SOMETHING list-attrs 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 
2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 SOMETHING list-attrs 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: _ 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: hinfo_key 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: snapset 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T22:59:57.518 
INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T22:59:57.518 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T22:59:57.530 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T22:59:57.530 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: 
activate_osd: ceph_args+= 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
start osd.2 2026-03-08T22:59:57.531 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 2026-03-08T22:59:57.529+0000 7f91a32a68c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 2026-03-08T22:59:57.537+0000 7f91a32a68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 2026-03-08T22:59:57.549+0000 7f91a32a68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 0 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 2026-03-08T22:59:58.509+0000 7f91a32a68c0 -1 Falling back to public interface 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 1 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 2026-03-08T22:59:59.521+0000 7f91a32a68c0 -1 osd.1 66 log_to_monitors true 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: 2 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.522 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:01.523 INFO:tasks.workunit.client.0.vm03.stderr:154329: 3 2026-03-08T23:00:01.523 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:01.523 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 2026-03-08T22:59:57.561+0000 7fa15b36a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 2026-03-08T22:59:57.561+0000 7fa15b36a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 2026-03-08T22:59:57.565+0000 7fa15b36a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 0 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 2026-03-08T22:59:58.537+0000 7fa15b36a8c0 -1 Falling back to public interface 2026-03-08T23:00:01.586 
INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 1 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 2026-03-08T22:59:59.541+0000 7fa15b36a8c0 -1 osd.2 66 log_to_monitors true 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: 2 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 
2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:01.586 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:01.587 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:01.587 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:01.587 INFO:tasks.workunit.client.0.vm03.stderr:154331: 3 2026-03-08T23:00:01.587 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:01.587 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:03.056 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:03.056 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:03.056 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:03.056 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:00:03.057 
INFO:tasks.workunit.client.0.vm03.stderr:154329: 4 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: osd.1 up in weight 1 up_from 70 up_thru 70 down_at 67 last_clean_interval [63,66) [v2:127.0.0.1:6810/2771808559,v1:127.0.0.1:6811/2771808559] [v2:127.0.0.1:6812/2771808559,v1:127.0.0.1:6813/2771808559] exists,up 5ef39c9d-4d00-4ccf-aa53-af311ceb2231 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:00:03.057 
INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: 
flush_pg_stats: local timeout=300 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: 1 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: 2 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: 3' 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836509 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836509 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509' 2026-03-08T23:00:03.057 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:03.154 INFO:tasks.workunit.client.0.vm03.stderr:154329: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: 4 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: osd.2 up in weight 1 up_from 70 up_thru 48 down_at 67 last_clean_interval [65,66) [v2:127.0.0.1:6818/2558455513,v1:127.0.0.1:6819/2558455513] [v2:127.0.0.1:6820/2558455513,v1:127.0.0.1:6821/2558455513] exists,up 79fcfabc-4e85-41f1-b208-de561eec9f60 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: 
objectstore_tool: wait_for_clean 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: 1 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: 2 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: 3' 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836510 2026-03-08T23:00:03.155 
INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836510 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836510' 2026-03-08T23:00:03.155 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710722 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710722 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-300647710722' 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710722 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
300647710722 2026-03-08T23:00:04.765 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-300647710722 2-300647710722' 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168586 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168586 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836509 1-300647710722 2-300647710722 3-240518168586' 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836509 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836509 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836509 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836509' 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: waiting osd.0 seq 21474836509 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836508 -lt 21474836509 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836510 -lt 21474836509 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-300647710722 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:04.766 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-300647710722 2026-03-08T23:00:04.829 INFO:tasks.workunit.client.0.vm03.stderr:15432e/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:00:04.829 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710723 2026-03-08T23:00:04.829 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710723 2026-03-08T23:00:04.829 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836510 1-300647710723' 2026-03-08T23:00:04.829 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:04.829 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:00:04.829 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710723 2026-03-08T23:00:04.829 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710723 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836510 1-300647710723 2-300647710723' 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=240518168587 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 240518168587 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836510 1-300647710723 2-300647710723 3-240518168587' 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836510 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836510 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836510 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836510' 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: waiting osd.0 seq 21474836510 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836508 -lt 21474836510 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:00:04.830 
INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836510 -lt 21474836510 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-300647710723 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:04.830 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-300647710723 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154339: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710722 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 300647710722' 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: waiting osd.1 seq 300647710722 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710723 -lt 300647710722 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-300647710722 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-300647710722 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710722 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 300647710722' 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: waiting osd.2 seq 300647710722 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710723 -lt 300647710722 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168586 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:00:05.578 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-240518168586 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168586 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168586' 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: waiting osd.3 seq 240518168586 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168587 -lt 240518168586 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:00:05.579 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(1: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710723 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 300647710723' 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: waiting osd.1 seq 300647710723 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710723 -lt 300647710723 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-300647710723 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-300647710723 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710723 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 300647710723' 2026-03-08T23:00:05.659 INFO:tasks.workunit.client.0.vm03.stderr:154331: waiting osd.2 seq 300647710723 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710723 -lt 300647710723 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-240518168587 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-240518168587 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=240518168587 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 240518168587' 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: waiting osd.3 seq 240518168587 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 240518168587 -lt 240518168587 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:00:05.660 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:00:06.018 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:00:06.018 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:00:06.018 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:00:06.018 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:00:06.018 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:00:06.018 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:06.018 INFO:tasks.workunit.client.0.vm03.stderr:154329: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:06.018 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:00:06.018 
INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:00:06.019 INFO:tasks.workunit.client.0.vm03.stderr:154329: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:00:06.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:00:06.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:00:06.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 154327 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:contains("stale") | not)' 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:06.066 
INFO:tasks.workunit.client.0.vm03.stderr:154331: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:00:06.066 INFO:tasks.workunit.client.0.vm03.stderr:154331: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:00:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:00:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:00:06.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:00:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:00:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:212: corrupt_and_repair_two: return_code=0 2026-03-08T23:00:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:213: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:00:06.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:215: corrupt_and_repair_two: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:00:06.093 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:216: corrupt_and_repair_two: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:263: corrupt_and_repair_erasure_coded: corrupt_and_repair_two td/osd-scrub-repair ecpool 3 1 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:185: corrupt_and_repair_two: local dir=td/osd-scrub-repair 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:186: corrupt_and_repair_two: local poolname=ecpool 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:187: corrupt_and_repair_two: local first=3 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:188: corrupt_and_repair_two: local second=1 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:193: corrupt_and_repair_two: pids= 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:194: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 
2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 156690"' 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 156690' 2026-03-08T23:00:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:195: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 156691"' 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 156691' 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:196: corrupt_and_repair_two: wait_background pids 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 156690 156691' 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for 
pid in $pids 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 156690 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/156693: /' 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/156695: /' 2026-03-08T23:00:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T23:00:06.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:00:07.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:00:07.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: remove 0#2:eb822e21:::SOMETHING:head# 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:00:07.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:07.411 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:07.411 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:00:07.412 
INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' 
--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: start osd.3 2026-03-08T23:00:07.412 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 
2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:00:07.606 INFO:tasks.workunit.client.0.vm03.stderr:156693: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:00:07.607 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:00:07.607 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:00:07.607 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:00:07.607 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:00:07.607 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:00:07.607 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:07.610 
INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: 
activate_osd: ceph_args+= 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: start osd.1 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:00:07.610 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/o156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: 2026-03-08T23:00:07.429+0000 7f0199e2e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: 2026-03-08T23:00:07.429+0000 7f0199e2e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: 2026-03-08T23:00:07.433+0000 7f0199e2e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: 0 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: 2026-03-08T23:00:08.401+0000 7f0199e2e8c0 -1 Falling back to public interface 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: 1 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:11.409 
INFO:tasks.workunit.client.0.vm03.stderr:156695: 2026-03-08T23:00:09.385+0000 7f0199e2e8c0 -1 osd.3 72 log_to_monitors true 2026-03-08T23:00:11.409 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: 2 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: 3 2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:00:11.410 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156695: osd.3 up in weight 1 up_from 77 up_thru 77 down_at 73 last_sd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: 2026-03-08T23:00:07.645+0000 7f73154978c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: 2026-03-08T23:00:07.653+0000 7f73154978c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: 2026-03-08T23:00:07.657+0000 7f73154978c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: 0 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:11.692 INFO:tasks.workunit.client.0.vm03.stderr:156693: 2026-03-08T23:00:08.869+0000 7f73154978c0 -1 Falling back to public interface 2026-03-08T23:00:11.692 
INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: 1 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: 2026-03-08T23:00:10.181+0000 7f73154978c0 -1 osd.1 72 log_to_monitors true 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: 2 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: 3 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:11.693 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:11.891 INFO:tasks.workunit.client.0.vm03.stderr:156693: osd.1 up in weight 1 up_from 80 up_thru 80 down_at 73 last_clean_interval [56,72) [v2:127.0.0.1:6810/2950836416,v1:127.0.0.1:6811/2950836416] [v2:127.0.0.1:6812/2950836416,v1:127.0.0.1:6813/2950836416] exists,up 15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:11.892 
INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:00:11.892 
INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: 1 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: 2 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: 3' 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836513 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836513 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513' 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=343597383682 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 343597383682 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-343597383682' 2026-03-08T23:00:11.892 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:12.183 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph clean_interval [70,72) [v2:127.0.0.1:6826/280997455,v1:127.0.0.1:6827/280997455] 
[v2:127.0.0.1:6828/280997455,v1:127.0.0.1:6829/280997455] exists,up 5ef39c9d-4d00-4ccf-aa53-af311ceb2231 2026-03-08T23:00:12.183 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:12.183 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:12.183 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:12.183 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:00:12.183 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:00:12.183 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: 1 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: 2 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: 3' 
2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836514 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836514 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836514' 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=343597383683 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 343597383683 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836514 1-343597383683' 2026-03-08T23:00:12.184 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710727 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710727 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836514 1-343597383683 2-300647710727' 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481795 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481795 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836514 1-343597383683 2-300647710727 3-330712481795' 2026-03-08T23:00:12.754 
INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836514 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836514 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836514 2026-03-08T23:00:12.754 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836514' 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: waiting osd.0 seq 21474836514 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836514 -lt 21474836514 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-343597383683 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-343597383683 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=343597383683 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 343597383683' 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: waiting osd.1 seq 343597383683 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 343597383683 -lt 343597383683 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-300647710727 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:00:12.755 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-300647710727 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtesttell osd.2 flush_pg_stats 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710726 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710726 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-343597383682 2-300647710726' 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481794 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481794 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-343597383682 2-300647710726 3-330712481794' 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836513 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836513 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836513 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836513' 2026-03-08T23:00:13.679 
INFO:tasks.workunit.client.0.vm03.stderr:156695: waiting osd.0 seq 21474836513 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836513 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836514 -lt 21474836513 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-343597383682 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-343597383682 2026-03-08T23:00:13.679 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:13.680 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=343597383682 2026-03-08T23:00:13.680 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 343597383682' 2026-03-08T23:00:13.680 INFO:tasks.workunit.client.0.vm03.stderr:156695: waiting osd.1 seq 343597383682 2026-03-08T23:00:13.680 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:00:13.680 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 343597383683 -lt 343597383682 2026-03-08T23:00:13.680 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:14.491 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-300647710726 2026-03-08T23:00:14.491 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:14.491 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:00:14.491 INFO:tasks.workunit.client.0.vm03.stderr:156695: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-300647710726
2026-03-08T23:00:14.491 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:00:14.491 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710726
2026-03-08T23:00:14.491 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 300647710726'
2026-03-08T23:00:14.491 INFO:tasks.workunit.client.0.vm03.stderr:156695: waiting osd.2 seq 300647710726
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710726 -lt 300647710726
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-330712481794
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-330712481794
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481794
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 330712481794'
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: waiting osd.3 seq 330712481794
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481794 -lt 330712481794
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T23:00:14.492 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:00:14.701 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:00:14.701 INFO:tasks.workunit.client.0.vm03.stderr:156695: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:00:14.702 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T23:00:14.702 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:00:14.702 INFO:tasks.workunit.client.0.vm03.stderr:156695: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:00:14.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0
2026-03-08T23:00:14.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids
2026-03-08T23:00:14.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 156691
2026-03-08T23:00:15.716 INFO:tasks.workunit.client.0.vm03.stderr:/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710727
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 300647710727'
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: waiting osd.2 seq 300647710727
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710726 -lt 300647710727
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710726 -lt 300647710727
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710727 -lt 300647710727
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-330712481795
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-330712481795
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481795
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 330712481795'
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: waiting osd.3 seq 330712481795
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481795 -lt 330712481795
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:00:15.717 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:156693: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:156693: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\'''
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids=
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:197: corrupt_and_repair_two: return_code=0
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:198: corrupt_and_repair_two: '[' 0 -ne 0 ']'
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: get_pg ecpool SOMETHING
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool
2026-03-08T23:00:16.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING
2026-03-08T23:00:16.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING
2026-03-08T23:00:16.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T23:00:16.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: local pg=2.0
2026-03-08T23:00:16.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:204: corrupt_and_repair_two: repair 2.0
2026-03-08T23:00:16.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0
2026-03-08T23:00:16.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0
2026-03-08T23:00:16.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:00:16.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:00:16.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:00:16.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:00:16.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T22:59:51.114908+0000
2026-03-08T23:00:16.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0
2026-03-08T23:00:16.654 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T22:59:51.114908+0000
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T22:59:51.114908+0000
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:00:16.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:00:16.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:00:16.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:51.114908+0000 '>' 2026-03-08T22:59:51.114908+0000
2026-03-08T23:00:16.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:00:17.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:00:17.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:00:17.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:00:17.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:00:17.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:00:17.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:00:17.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:00:18.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:51.114908+0000 '>' 2026-03-08T22:59:51.114908+0000
2026-03-08T23:00:18.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:00:19.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:00:19.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:00:19.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:00:19.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:00:19.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:00:19.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:00:19.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:00:19.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T22:59:51.114908+0000 '>' 2026-03-08T22:59:51.114908+0000
2026-03-08T23:00:19.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:00:20.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:00:20.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:00:20.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:00:20.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:00:20.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:00:20.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:00:20.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:00:20.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:00:17.422526+0000 '>' 2026-03-08T22:59:51.114908+0000
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:208: corrupt_and_repair_two: pids=
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:209: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 159543"'
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 159543'
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:210: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 159545"'
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 159545'
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:211: corrupt_and_repair_two: wait_background pids
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 159543 159545'
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 159543
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/159546: /'
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/159548: /'
2026-03-08T23:00:20.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs
2026-03-08T23:00:20.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs
2026-03-08T23:00:21.130 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING list-attrs
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING list-attrs
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: Error getting attr on : 2.0s0_head,0#-4:00000000:::scrub_2.0s0:head#, (61) No data available
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: _
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: hinfo_key
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: snapset
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3
2026-03-08T23:00:21.131 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3'
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal'
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:00:21.132 INFO:tasks.workunit.client.0.vm03.stderr:159546: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p
td/osd-scrub-repair/3 2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:00:21.133 INFO:tasks.workunit.client.0.vm03.stderr:159546: start osd.3 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:00:21.150 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING list-attrs 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: _ 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: hinfo_key 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: snapset 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local 
dir=td/osd-scrub-repair 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:00:21.151 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: 
ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:00:21.159 
INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:00:21.159 INFO:tasks.workunit.client.0.vm03.stderr:159548: start osd.1 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_reclen=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: 2026-03-08T23:00:21.153+0000 7f45376528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: 2026-03-08T23:00:21.165+0000 7f45376528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: 2026-03-08T23:00:21.173+0000 7f45376528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: 0 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: 2026-03-08T23:00:22.129+0000 7f45376528c0 -1 Falling back to public interface 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: 1 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:24.977 
INFO:tasks.workunit.client.0.vm03.stderr:159546: 2026-03-08T23:00:23.421+0000 7f45376528c0 -1 osd.3 81 log_to_monitors true 2026-03-08T23:00:24.977 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: 2 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: 3 2026-03-08T23:00:24.978 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:00:25.289 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helperovery_ops 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: 2026-03-08T23:00:21.193+0000 7fb26efb78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: 2026-03-08T23:00:21.205+0000 7fb26efb78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: 2026-03-08T23:00:21.213+0000 7fb26efb78c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: 0 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: 2026-03-08T23:00:22.165+0000 7fb26efb78c0 -1 Falling back to public interface 2026-03-08T23:00:25.290 
INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: 1 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: 2026-03-08T23:00:23.429+0000 7fb26efb78c0 -1 osd.1 81 log_to_monitors true 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:25.290 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: 2 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: 3 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:25.291 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:00:26.859 
INFO:tasks.workunit.client.0.vm03.stderr:159548: 4 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: osd.1 up in weight 1 up_from 85 up_thru 85 down_at 82 last_clean_interval [80,81) [v2:127.0.0.1:6826/2695316585,v1:127.0.0.1:6827/2695316585] [v2:127.0.0.1:6828/2695316585,v1:127.0.0.1:6829/2695316585] exists,up 5ef39c9d-4d00-4ccf-aa53-af311ceb2231 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:00:26.859 
INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: 
flush_pg_stats: local timeout=300 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: 1 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: 2 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: 3' 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836518 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836518 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836518' 2026-03-08T23:00:26.859 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:26.870 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalons.sh:984: 
wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: 4 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: osd.3 up in weight 1 up_from 85 up_thru 85 down_at 82 last_clean_interval [77,81) [v2:127.0.0.1:6810/1664142205,v1:127.0.0.1:6811/1664142205] [v2:127.0.0.1:6812/1664142205,v1:127.0.0.1:6813/1664142205] exists,up 15f78b88-fe52-405a-a6d1-07f48fd457b8 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:26.871 
INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' 
'3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: 1 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: 2 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: 3' 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:00:26.871 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:26.872 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:00:26.872 INFO:tasks.workunit.client.0.vm03.stderr:159546: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836519 2026-03-08T23:00:26.872 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836519 2026-03-08T23:00:26.872 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519' 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helperse/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=365072220162 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 365072220162 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836518 1-365072220162' 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710731 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710731 2026-03-08T23:00:28.551 
INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836518 1-365072220162 2-300647710731' 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=365072220162 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 365072220162 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836518 1-365072220162 2-300647710731 3-365072220162' 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836518 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836518 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836518 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836518' 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: waiting osd.0 seq 21474836518 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:28.551 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836517 -lt 21474836518 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836519 -lt 21474836518 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
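The `flush_pg_stats` flow traced above is two phases: first ask each OSD to publish its stats (`ceph tell osd.N flush_pg_stats` returns a sequence number, collected as `N-seq` pairs), then poll `ceph osd last-stat-seq N` until the monitor has seen at least that sequence, sleeping 1s between polls. A hedged sketch of that flow, reconstructed from the xtrace rather than copied from `ceph-helpers.sh`; the `ceph` stub and `_sketch` name are stand-ins so the block runs without a cluster:

```shell
# Stub ceph CLI: pretend the flush seq is already visible to the monitor.
ceph() {
    case "$1 $2" in
        "tell osd.0")        echo 21474836518 ;;  # stub: seq returned by flush_pg_stats
        "osd last-stat-seq") echo 21474836518 ;;  # stub: monitor already caught up
    esac
}

# Hedged sketch of ceph-helpers.sh:2262-2279 as seen in the trace.
flush_pg_stats_sketch() {
    local ids="0"           # the real helper uses `ceph osd ls`
    local seqs= osd seq s
    for osd in $ids; do
        seq=$(ceph tell osd.$osd flush_pg_stats)   # OSD publishes stats, returns a seq
        test -z "$seq" && return 1
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)           # split the "osd-seq" pair, as in the trace
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 1                                # monitor not caught up yet; retry
        done
    done
    return 0
}
```

This also explains the `test 21474836517 -lt 21474836518` / `sleep 1` / `test 21474836519 -lt 21474836518` sequence in the trace for osd.0: the first poll saw a stale seq, the second saw one past the target and the loop exited.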
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-365072220162 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:28.552 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-365072220162 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:15954.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=365072220163 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 365072220163 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-365072220163' 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710732 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710732 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-365072220163 2-300647710732' 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=365072220163 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 365072220163 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-365072220163 2-300647710732 3-365072220163' 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836519 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836519 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836519 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836519' 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: waiting osd.0 seq 21474836519 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836517 -lt 21474836519 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:00:28.584 
INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836519 -lt 21474836519 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-365072220163 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:00:28.584 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-365072220163 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=365072220163 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 365072220163' 
2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: waiting osd.1 seq 365072220163 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 365072220163 -lt 365072220163 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-300647710732 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-300647710732 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710732 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 300647710732' 2026-03-08T23:00:29.403 
INFO:tasks.workunit.client.0.vm03.stderr:159546: waiting osd.2 seq 300647710732 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710732 -lt 300647710732 2026-03-08T23:00:29.403 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-365072220163 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-365072220163 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=365072220163 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 365072220163' 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: waiting osd.3 seq 
365072220163 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 365072220163 -lt 365072220163 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:00:29.404 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/ce8: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=365072220162 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 365072220162' 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: waiting osd.1 seq 365072220162 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 365072220163 -lt 365072220162 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-300647710731 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-300647710731 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710731 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 300647710731' 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: waiting osd.2 seq 300647710731 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710732 -lt 300647710731 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-365072220162 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-365072220162 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=365072220162 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 365072220162' 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: waiting osd.3 seq 365072220162 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 365072220163 -lt 365072220162 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:00:29.444 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(phtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:159546: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:00:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 159545 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:contains("stale") | not)' 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:159548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:00:29.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:212: corrupt_and_repair_two: return_code=0 2026-03-08T23:00:29.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:213: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:00:29.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:215: 
corrupt_and_repair_two: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:00:29.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:216: corrupt_and_repair_two: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:00:29.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:00:29.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:00:29.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:00:29.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:00:29.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:00:29.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:00:29.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:00:29.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:00:29.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:00:30.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: 
return 0 2026-03-08T23:00:30.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:00:30.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:00:30.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:00:30.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:00:30.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:00:30.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:00:30.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:00:30.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:00:30.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:00:30.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:00:30.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:00:30.045 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:00:30.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:00:30.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:00:30.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:30.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:30.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown 
td/osd-scrub-repair 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:00:30.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:00:30.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:00:30.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:00:30.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:00:30.082 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:00:30.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:00:30.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: 
teardown: stat -f -c %T . 2026-03-08T23:00:30.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:00:30.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:00:30.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:00:30.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:00:30.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:00:30.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:00:30.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:00:30.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:00:30.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:00:30.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:00:30.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 
2026-03-08T23:00:30.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:30.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:30.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:00:30.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:00:30.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:00:30.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:00:30.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:00:30.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:30.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:30.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:00:30.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 
1024 ']' 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_and_repair_lrc_appends td/osd-scrub-repair 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:842: TEST_corrupt_and_repair_lrc_appends: corrupt_and_repair_lrc td/osd-scrub-repair false 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:825: corrupt_and_repair_lrc: local dir=td/osd-scrub-repair 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:826: corrupt_and_repair_lrc: local allow_overwrites=false 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:827: corrupt_and_repair_lrc: local poolname=ecpool 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:829: corrupt_and_repair_lrc: run_mon td/osd-scrub-repair a 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:00:30.094 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:00:30.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T23:00:30.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:00:30.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:30.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:30.312 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:30.312 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:30.312 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:30.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:30.312 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:00:30.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:00:30.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:00:30.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:00:30.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:00:30.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:00:30.346 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:00:30.347 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:00:30.347 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:00:30.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:00:30.348 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:00:30.348 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:30.348 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:30.348 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:00:30.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:00:30.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:00:30.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:00:30.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:00:30.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:00:30.428 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:00:30.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:00:30.429 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:00:30.429 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:00:30.429 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:00:30.429 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:00:30.429 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:30.429 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:30.429 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:00:30.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:00:30.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:00:30.498 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:830: corrupt_and_repair_lrc: run_mgr td/osd-scrub-repair x
2026-03-08T23:00:30.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T23:00:30.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:00:30.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:00:30.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:00:30.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T23:00:30.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:00:30.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:00:30.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:00:30.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:00:30.619 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:00:30.619 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:00:30.619 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:00:30.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:00:30.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:00:30.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:00:30.641 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: seq 0 9
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9)
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 0
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:00:30.644 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:00:30.645 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:00:30.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:00:30.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:00:30.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:00:30.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:00:30.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:00:30.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:00:30.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=30c33fe1-dc7a-45a6-9858-1313957652b7
2026-03-08T23:00:30.651 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 30c33fe1-dc7a-45a6-9858-1313957652b7
2026-03-08T23:00:30.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 30c33fe1-dc7a-45a6-9858-1313957652b7'
2026-03-08T23:00:30.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:00:30.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCO/61pXqnvJxAALtgN3Q5gUWZrhic1v7cLlg==
2026-03-08T23:00:30.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCO/61pXqnvJxAALtgN3Q5gUWZrhic1v7cLlg=="}'
2026-03-08T23:00:30.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 30c33fe1-dc7a-45a6-9858-1313957652b7 -i td/osd-scrub-repair/0/new.json
2026-03-08T23:00:30.774 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:00:30.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T23:00:30.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCO/61pXqnvJxAALtgN3Q5gUWZrhic1v7cLlg== --osd-uuid 30c33fe1-dc7a-45a6-9858-1313957652b7
2026-03-08T23:00:30.807 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:30.810+0000 7f21438518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:30.814 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:30.818+0000 7f21438518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:30.816 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:30.818+0000 7f21438518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:30.816 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:30.818+0000 7f21438518c0 -1 bdev(0x55821cdc8c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T23:00:30.816 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:30.818+0000 7f21438518c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T23:00:33.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T23:00:33.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:00:33.095 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T23:00:33.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T23:00:33.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:00:33.214 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:00:33.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T23:00:33.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:00:33.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:00:33.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:00:33.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:00:33.257 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:33.250+0000 7ff8177418c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:33.274 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:33.278+0000 7ff8177418c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:33.285 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:33.286+0000 7ff8177418c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:33.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T23:00:33.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:00:33.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:00:33.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:00:33.358 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:00:33.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:00:33.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:00:33.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:00:33.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:00:33.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:00:33.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:00:34.235 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:34.238+0000 7ff8177418c0 -1 Falling back to public interface
2026-03-08T23:00:34.471 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:00:34.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:00:34.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:00:34.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:00:34.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:00:34.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:00:34.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:00:35.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:35.210+0000 7ff8177418c0 -1 osd.0 0 log_to_monitors true
2026-03-08T23:00:35.642 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:00:35.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:00:35.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:00:35.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:00:35.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:00:35.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:00:35.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:00:36.374 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:36.378+0000 7ff812efa640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T23:00:36.875 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:00:36.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:00:36.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:00:36.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:00:36.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:00:36.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:00:37.050 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3799029828,v1:127.0.0.1:6803/3799029828] [v2:127.0.0.1:6804/3799029828,v1:127.0.0.1:6805/3799029828] exists,up 30c33fe1-dc7a-45a6-9858-1313957652b7
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9)
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 1
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:00:37.051 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:00:37.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:00:37.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:00:37.054 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 21636514-d625-405f-ab7f-b8c02cb8a73c
2026-03-08T23:00:37.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=21636514-d625-405f-ab7f-b8c02cb8a73c
2026-03-08T23:00:37.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 21636514-d625-405f-ab7f-b8c02cb8a73c'
2026-03-08T23:00:37.054 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:00:37.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCV/61pSRJQBBAABBDLsH9jrOgXU8WvR2PozA==
2026-03-08T23:00:37.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCV/61pSRJQBBAABBDLsH9jrOgXU8WvR2PozA=="}'
2026-03-08T23:00:37.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 21636514-d625-405f-ab7f-b8c02cb8a73c -i td/osd-scrub-repair/1/new.json
2026-03-08T23:00:37.286 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:00:37.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json
2026-03-08T23:00:37.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCV/61pSRJQBBAABBDLsH9jrOgXU8WvR2PozA== --osd-uuid 21636514-d625-405f-ab7f-b8c02cb8a73c
2026-03-08T23:00:37.328 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:37.330+0000 7f41da1ac8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:37.331 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:37.334+0000 7f41da1ac8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:37.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:37.334+0000 7f41da1ac8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:37.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:37.334+0000 7f41da1ac8c0 -1 bdev(0x55bdd73c7c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted
2026-03-08T23:00:37.332 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:37.334+0000 7f41da1ac8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid
2026-03-08T23:00:39.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring
2026-03-08T23:00:39.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:00:39.851 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T23:00:39.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T23:00:39.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:00:40.070 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T23:00:40.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T23:00:40.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:00:40.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:00:40.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:00:40.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:00:40.086 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:40.090+0000 7f063b19e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:40.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:40.098+0000 7f063b19e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:40.095 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:40.098+0000 7f063b19e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:00:40.256 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:00:40.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T23:00:40.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:00:40.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:00:40.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:00:40.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:00:40.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:00:40.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:00:40.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:00:40.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:00:40.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:00:41.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:41.070+0000 7f063b19e8c0 -1 Falling back to public interface
2026-03-08T23:00:41.451 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:00:41.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:00:41.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:00:41.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:00:41.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:00:41.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:00:41.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:00:42.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:42.026+0000 7f063b19e8c0 -1 osd.1 0 log_to_monitors true
2026-03-08T23:00:42.632 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:00:42.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:00:42.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:00:42.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:00:42.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:00:42.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:00:42.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:00:43.100 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:43.102+0000 7f0636957640 -1 osd.1 0 waiting for initial osdmap
2026-03-08T23:00:43.837 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:00:43.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:00:43.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:00:43.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:00:43.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:00:43.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:00:44.011 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1393629078,v1:127.0.0.1:6811/1393629078] [v2:127.0.0.1:6812/1393629078,v1:127.0.0.1:6813/1393629078] exists,up 21636514-d625-405f-ab7f-b8c02cb8a73c
2026-03-08T23:00:44.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:00:44.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:00:44.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:00:44.011
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:00:44.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 2 2026-03-08T23:00:44.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:00:44.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:00:44.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: 
ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:44.012 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:44.012 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:44.013 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:00:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:00:44.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:00:44.015 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 594a371a-8656-45b6-b1dd-736c038cd1d1 2026-03-08T23:00:44.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=594a371a-8656-45b6-b1dd-736c038cd1d1 2026-03-08T23:00:44.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 594a371a-8656-45b6-b1dd-736c038cd1d1' 2026-03-08T23:00:44.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:00:44.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCc/61p3jAQAhAAHN+Ausk2oqLmjPIPhKKeDg== 2026-03-08T23:00:44.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCc/61p3jAQAhAAHN+Ausk2oqLmjPIPhKKeDg=="}' 2026-03-08T23:00:44.029 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 594a371a-8656-45b6-b1dd-736c038cd1d1 -i td/osd-scrub-repair/2/new.json 2026-03-08T23:00:44.209 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:00:44.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T23:00:44.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCc/61p3jAQAhAAHN+Ausk2oqLmjPIPhKKeDg== --osd-uuid 594a371a-8656-45b6-b1dd-736c038cd1d1 2026-03-08T23:00:44.242 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:44.246+0000 7f70cc5a68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:44.244 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:44.246+0000 7f70cc5a68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:44.245 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:44.250+0000 7f70cc5a68c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:44.246 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:44.250+0000 7f70cc5a68c0 -1 bdev(0x565414b5fc00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:00:44.246 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:44.250+0000 7f70cc5a68c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T23:00:46.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T23:00:46.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:00:46.531 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:00:46.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:00:46.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:00:46.750 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:00:46.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:00:46.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:46.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:00:46.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:00:46.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:00:46.768 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:46.770+0000 7f9eb066e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:46.769 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:46.770+0000 7f9eb066e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:46.771 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:46.774+0000 7f9eb066e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:47.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:47.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:47.991 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:47.994+0000 7f9eb066e8c0 -1 Falling back to public interface 2026-03-08T23:00:48.212 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:00:48.212 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:48.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:48.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:48.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:48.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:48.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:48.740 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:48.742+0000 7f9eb066e8c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:00:49.394 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:00:49.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:49.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:49.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:49.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:49.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:49.594 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:50.595 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:00:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:50.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:50.857 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:50.858+0000 7f9eabe27640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:00:51.781 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:51.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3041005457,v1:127.0.0.1:6819/3041005457] [v2:127.0.0.1:6820/3041005457,v1:127.0.0.1:6821/3041005457] exists,up 594a371a-8656-45b6-b1dd-736c038cd1d1 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 3 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T23:00:51.968 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:00:51.968 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:00:51.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:00:51.969 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:00:51.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:00:51.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: 
run_osd: uuidgen 2026-03-08T23:00:51.971 INFO:tasks.workunit.client.0.vm03.stdout:add osd3 df112e4e-0f41-42b6-ad94-9c02d4399d65 2026-03-08T23:00:51.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=df112e4e-0f41-42b6-ad94-9c02d4399d65 2026-03-08T23:00:51.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 df112e4e-0f41-42b6-ad94-9c02d4399d65' 2026-03-08T23:00:51.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:00:51.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCj/61pOvcAOxAAl/UCmvwNIkTFVZqUaTZRfA== 2026-03-08T23:00:51.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCj/61pOvcAOxAAl/UCmvwNIkTFVZqUaTZRfA=="}' 2026-03-08T23:00:51.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new df112e4e-0f41-42b6-ad94-9c02d4399d65 -i td/osd-scrub-repair/3/new.json 2026-03-08T23:00:52.158 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:00:52.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/3/new.json 2026-03-08T23:00:52.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 
--osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCj/61pOvcAOxAAl/UCmvwNIkTFVZqUaTZRfA== --osd-uuid df112e4e-0f41-42b6-ad94-9c02d4399d65 2026-03-08T23:00:52.193 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:52.198+0000 7f919b94a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:52.195 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:52.198+0000 7f919b94a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:52.196 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:52.198+0000 7f919b94a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:52.197 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:52.198+0000 7f919b94a8c0 -1 bdev(0x55f4ea031c00 td/osd-scrub-repair/3/block) open stat got: (1) Operation not permitted 2026-03-08T23:00:52.197 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:52.202+0000 7f919b94a8c0 -1 bluestore(td/osd-scrub-repair/3) _read_fsid unparsable uuid 2026-03-08T23:00:54.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/3/keyring 2026-03-08T23:00:54.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:00:54.792 INFO:tasks.workunit.client.0.vm03.stdout:adding osd3 key to auth repository 2026-03-08T23:00:54.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T23:00:54.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:00:55.019 INFO:tasks.workunit.client.0.vm03.stdout:start osd.3 2026-03-08T23:00:55.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T23:00:55.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:00:55.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:00:55.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:00:55.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:00:55.040 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:55.042+0000 7fce6daec8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:55.046 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:55.050+0000 7fce6daec8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:00:55.049 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:55.050+0000 7fce6daec8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:00:55.213 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:55.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:55.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:56.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:56.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T23:00:56.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:00:56.399 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:00:56.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:56.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:56.506 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:56.510+0000 7fce6daec8c0 -1 Falling back to public interface 2026-03-08T23:00:56.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:57.503 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:00:57.506+0000 7fce6daec8c0 -1 osd.3 0 log_to_monitors true 2026-03-08T23:00:57.582 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:00:57.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:57.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:57.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:00:57.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:57.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:57.761 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:58.762 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:00:58.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:58.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:58.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:00:58.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:58.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:00:58.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:00:59.960 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:00:59.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:00:59.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:00:59.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:00:59.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:00:59.960 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3701643638,v1:127.0.0.1:6827/3701643638] [v2:127.0.0.1:6828/3701643638,v1:127.0.0.1:6829/3701643638] exists,up df112e4e-0f41-42b6-ad94-9c02d4399d65 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 4 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:00.151 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/4 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/4' 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/4/journal' 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:00.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:01:00.152 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:00.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:00.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:00.152 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:00.152 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:00.152 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:01:00.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:00.153 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:00.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/4 2026-03-08T23:01:00.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:00.155 INFO:tasks.workunit.client.0.vm03.stdout:add osd4 6f4da644-c6d1-4579-bfc2-ea04410dafb7 2026-03-08T23:01:00.155 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=6f4da644-c6d1-4579-bfc2-ea04410dafb7 2026-03-08T23:01:00.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 6f4da644-c6d1-4579-bfc2-ea04410dafb7' 2026-03-08T23:01:00.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:00.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCs/61pgENnChAAP71aLQtDy1B3k9opS2lBSQ== 2026-03-08T23:01:00.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCs/61pgENnChAAP71aLQtDy1B3k9opS2lBSQ=="}' 2026-03-08T23:01:00.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 6f4da644-c6d1-4579-bfc2-ea04410dafb7 -i td/osd-scrub-repair/4/new.json 2026-03-08T23:01:00.333 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:01:00.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/4/new.json 2026-03-08T23:01:00.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/4 --osd-journal=td/osd-scrub-repair/4/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCs/61pgENnChAAP71aLQtDy1B3k9opS2lBSQ== --osd-uuid 6f4da644-c6d1-4579-bfc2-ea04410dafb7 2026-03-08T23:01:00.370 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:00.374+0000 7f2adc5998c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:00.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:00.374+0000 7f2adc5998c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:00.373 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:00.378+0000 7f2adc5998c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:00.374 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:00.378+0000 7f2adc5998c0 -1 bdev(0x561f8cb6dc00 td/osd-scrub-repair/4/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:00.374 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:00.378+0000 7f2adc5998c0 -1 bluestore(td/osd-scrub-repair/4) _read_fsid unparsable uuid 2026-03-08T23:01:03.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/4/keyring 2026-03-08T23:01:03.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:03.001 INFO:tasks.workunit.client.0.vm03.stdout:adding osd4 key to auth repository 2026-03-08T23:01:03.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T23:01:03.001 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:03.221 INFO:tasks.workunit.client.0.vm03.stdout:start osd.4 2026-03-08T23:01:03.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T23:01:03.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/4 --osd-journal=td/osd-scrub-repair/4/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:03.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:03.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:03.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:03.240 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:03.242+0000 7f26e5b4f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:03.241 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:03.246+0000 7f26e5b4f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:03.243 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:03.246+0000 7f26e5b4f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:03.421 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:01:03.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T23:01:03.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:03.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T23:01:03.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:03.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:03.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:03.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:03.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T23:01:03.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:03.613 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:03.962 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:03.966+0000 7f26e5b4f8c0 -1 Falling back to public interface 2026-03-08T23:01:04.614 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:01:04.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:04.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:04.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:04.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:04.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T23:01:04.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:04.932 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:04.934+0000 7f26e5b4f8c0 -1 osd.4 0 log_to_monitors true 2026-03-08T23:01:05.798 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:01:05.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:05.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:05.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T23:01:05.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:05.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T23:01:05.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:06.998 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:01:06.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:06.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:06.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:06.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:06.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stdout:osd.4 up in weight 1 up_from 25 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/321445712,v1:127.0.0.1:6835/321445712] [v2:127.0.0.1:6836/321445712,v1:127.0.0.1:6837/321445712] exists,up 6f4da644-c6d1-4579-bfc2-ea04410dafb7 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: 
wait_for_osd: break 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 5 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:07.191 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:01:07.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:07.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:07.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:07.192 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:07.192 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:07.192 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:01:07.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: 
run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:07.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:01:07.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:07.195 INFO:tasks.workunit.client.0.vm03.stdout:add osd5 0c5fe5be-089f-445b-a798-38a5fe8df624 2026-03-08T23:01:07.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0c5fe5be-089f-445b-a798-38a5fe8df624 2026-03-08T23:01:07.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 0c5fe5be-089f-445b-a798-38a5fe8df624' 2026-03-08T23:01:07.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:07.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: 
OSD_SECRET=AQCz/61pBCLFDBAA0vfEIneyadH6OHBSOHSbbg== 2026-03-08T23:01:07.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCz/61pBCLFDBAA0vfEIneyadH6OHBSOHSbbg=="}' 2026-03-08T23:01:07.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0c5fe5be-089f-445b-a798-38a5fe8df624 -i td/osd-scrub-repair/5/new.json 2026-03-08T23:01:07.375 INFO:tasks.workunit.client.0.vm03.stdout:5 2026-03-08T23:01:07.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/5/new.json 2026-03-08T23:01:07.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCz/61pBCLFDBAA0vfEIneyadH6OHBSOHSbbg== --osd-uuid 0c5fe5be-089f-445b-a798-38a5fe8df624 2026-03-08T23:01:07.408 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:07.410+0000 7fe8cd1148c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:07.409 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:07.414+0000 7fe8cd1148c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:07.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:07.414+0000 7fe8cd1148c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:07.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:07.414+0000 7fe8cd1148c0 -1 bdev(0x55fa8ae6fc00 td/osd-scrub-repair/5/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:07.411 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:07.414+0000 7fe8cd1148c0 -1 bluestore(td/osd-scrub-repair/5) _read_fsid unparsable uuid 2026-03-08T23:01:09.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/5/keyring 2026-03-08T23:01:09.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:09.908 INFO:tasks.workunit.client.0.vm03.stdout:adding osd5 key to auth repository 2026-03-08T23:01:09.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T23:01:09.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:10.128 INFO:tasks.workunit.client.0.vm03.stdout:start osd.5 2026-03-08T23:01:10.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T23:01:10.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:10.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:10.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:10.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:10.148 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:10.150+0000 7f3d08aa98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:10.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:10.158+0000 7f3d08aa98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:10.157 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:10.158+0000 7f3d08aa98c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:10.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:01:10.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:11.362 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:11.366+0000 7f3d08aa98c0 -1 Falling back to public interface 2026-03-08T23:01:11.540 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:01:11.540 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:11.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:11.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:11.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:11.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:01:11.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:12.709 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:01:12.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:12.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:12.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:12.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:12.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:01:12.866 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:12.870+0000 7f3d08aa98c0 -1 osd.5 0 log_to_monitors true 2026-03-08T23:01:12.889 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:13.891 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:01:13.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:13.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:13.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:13.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:13.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:01:14.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:15.076 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:01:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:01:15.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:15.076 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:01:15.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:16.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:16.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:16.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 5 2026-03-08T23:01:16.273 INFO:tasks.workunit.client.0.vm03.stdout:5 2026-03-08T23:01:16.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:01:16.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:16.462 INFO:tasks.workunit.client.0.vm03.stdout:osd.5 up in weight 1 up_from 30 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/1182240032,v1:127.0.0.1:6843/1182240032] [v2:127.0.0.1:6844/1182240032,v1:127.0.0.1:6845/1182240032] exists,up 0c5fe5be-089f-445b-a798-38a5fe8df624 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:16.463 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 6 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=6 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/6 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: 
ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/6' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/6/journal' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:16.463 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:01:16.464 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:16.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:16.465 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:16.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:16.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:16.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/6 2026-03-08T23:01:16.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:16.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=7c2aedff-fccf-4252-9da3-46a87bbd04d7 2026-03-08T23:01:16.467 INFO:tasks.workunit.client.0.vm03.stdout:add osd6 7c2aedff-fccf-4252-9da3-46a87bbd04d7 2026-03-08T23:01:16.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd6 7c2aedff-fccf-4252-9da3-46a87bbd04d7' 2026-03-08T23:01:16.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:16.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC8/61pAqkbHRAAPLfHkcFKv1FUKJbjok2Lgg== 2026-03-08T23:01:16.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC8/61pAqkbHRAAPLfHkcFKv1FUKJbjok2Lgg=="}' 2026-03-08T23:01:16.483 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 7c2aedff-fccf-4252-9da3-46a87bbd04d7 -i td/osd-scrub-repair/6/new.json 2026-03-08T23:01:16.660 INFO:tasks.workunit.client.0.vm03.stdout:6 2026-03-08T23:01:16.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/6/new.json 2026-03-08T23:01:16.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 6 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/6 --osd-journal=td/osd-scrub-repair/6/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQC8/61pAqkbHRAAPLfHkcFKv1FUKJbjok2Lgg== --osd-uuid 7c2aedff-fccf-4252-9da3-46a87bbd04d7 2026-03-08T23:01:16.693 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:16.694+0000 7f1f95f6a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:16.695 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:16.698+0000 7f1f95f6a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:16.696 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:16.698+0000 7f1f95f6a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:16.696 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:16.698+0000 7f1f95f6a8c0 -1 bdev(0x55b2c0189c00 td/osd-scrub-repair/6/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:16.697 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:16.698+0000 7f1f95f6a8c0 -1 bluestore(td/osd-scrub-repair/6) _read_fsid unparsable uuid 2026-03-08T23:01:18.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/6/keyring 2026-03-08T23:01:18.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:18.955 INFO:tasks.workunit.client.0.vm03.stdout:adding osd6 key to auth repository 2026-03-08T23:01:18.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd6 key to auth repository 2026-03-08T23:01:18.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/6/keyring auth add osd.6 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:19.169 INFO:tasks.workunit.client.0.vm03.stdout:start osd.6 2026-03-08T23:01:19.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.6 2026-03-08T23:01:19.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 6 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/6 --osd-journal=td/osd-scrub-repair/6/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:19.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:19.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:19.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:19.185 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:19.186+0000 7fb2fa9d18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:19.186 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:19.190+0000 7fb2fa9d18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:19.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:19.190+0000 7fb2fa9d18c0 -1 WARNING: all dangerous and experimental features are enabled. 
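The trace above shows `run_osd` bringing up osd.6 in phases: `ceph osd new` allocates the id, `ceph-osd --mkfs` formats the object store, `ceph auth add` registers the key, then `ceph-osd` starts the daemon. A dry-run sketch of that flow, reconstructed from the trace (not copied from ceph-helpers.sh): `CEPH`/`CEPH_OSD` default to `echo` stand-ins so nothing real runs, and the directory name, uuid, and secret are placeholders, not values from this job.

```shell
#!/bin/sh
# Dry-run sketch of the run_osd flow traced above (ceph-helpers.sh:660-678).
# CEPH and CEPH_OSD are echo stand-ins; replace with the real binaries to
# execute. dir, uuid, and secret are illustrative placeholders.
CEPH="echo ceph"                  # stands in for the ceph CLI
CEPH_OSD="echo ceph-osd"          # stands in for the ceph-osd daemon binary
dir=td-sketch; id=6
uuid=11111111-2222-3333-4444-555555555555   # real helper calls uuidgen
secret=PLACEHOLDER_KEY                      # real helper: ceph-authtool --gen-print-key

mkdir -p "$dir/$id"
printf '{"cephx_secret": "%s"}\n' "$secret" > "$dir/$id/new.json"

# Phase 1: register the OSD id and its cephx secret with the monitors.
$CEPH osd new "$uuid" -i "$dir/$id/new.json"
rm "$dir/$id/new.json"

# Phase 2: format the OSD's data directory (--mkfs exits after formatting).
$CEPH_OSD -i "$id" --osd-data="$dir/$id" --mkfs \
          --key "$secret" --osd-uuid "$uuid"

# Phase 3: add the key to the auth database, then start the daemon proper.
$CEPH -i "$dir/$id/keyring" auth add "osd.$id" \
      osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
$CEPH_OSD -i "$id" --osd-data="$dir/$id"

rm -rf "$dir"
```

With the echo stand-ins this only prints the commands it would run, which makes the two-invocation pattern visible: the same `ceph-osd` command line is issued twice, once with `--mkfs` and once without.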
2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 6 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T23:01:19.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:19.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:20.521 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:01:20.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:20.521 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:20.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:20.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:20.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T23:01:20.634 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:20.638+0000 7fb2fa9d18c0 -1 Falling back to public interface 2026-03-08T23:01:20.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:21.683 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:21.686+0000 7fb2fa9d18c0 -1 osd.6 0 log_to_monitors true 2026-03-08T23:01:21.698 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:01:21.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:21.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:21.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:21.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:21.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T23:01:21.901 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:22.902 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:01:22.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:22.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:22.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:22.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:22.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T23:01:23.072 INFO:tasks.workunit.client.0.vm03.stdout:osd.6 up in weight 1 up_from 35 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6850/2136186362,v1:127.0.0.1:6851/2136186362] [v2:127.0.0.1:6852/2136186362,v1:127.0.0.1:6853/2136186362] exists,up 7c2aedff-fccf-4252-9da3-46a87bbd04d7 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 
2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 7 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=7 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/7 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:23.073 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/7' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/7/journal' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:23.073 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:01:23.073 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:23.074 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/7 2026-03-08T23:01:23.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:23.076 INFO:tasks.workunit.client.0.vm03.stdout:add osd7 3fceaf6a-e5c9-4b24-8a59-9233cf5a6090 2026-03-08T23:01:23.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3fceaf6a-e5c9-4b24-8a59-9233cf5a6090 2026-03-08T23:01:23.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd7 3fceaf6a-e5c9-4b24-8a59-9233cf5a6090' 2026-03-08T23:01:23.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:23.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDD/61pHRCYBRAA71zG5RdanEbnguGKAaMkug== 2026-03-08T23:01:23.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDD/61pHRCYBRAA71zG5RdanEbnguGKAaMkug=="}' 2026-03-08T23:01:23.088 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3fceaf6a-e5c9-4b24-8a59-9233cf5a6090 -i td/osd-scrub-repair/7/new.json 2026-03-08T23:01:23.254 INFO:tasks.workunit.client.0.vm03.stdout:7 2026-03-08T23:01:23.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/7/new.json 2026-03-08T23:01:23.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 7 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/7 --osd-journal=td/osd-scrub-repair/7/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDD/61pHRCYBRAA71zG5RdanEbnguGKAaMkug== --osd-uuid 3fceaf6a-e5c9-4b24-8a59-9233cf5a6090 2026-03-08T23:01:23.285 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:23.286+0000 7f18b12f48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:23.286 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:23.290+0000 7f18b12f48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:23.287 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:23.290+0000 7f18b12f48c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:23.287 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:23.290+0000 7f18b12f48c0 -1 bdev(0x55fb28bbdc00 td/osd-scrub-repair/7/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:23.288 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:23.290+0000 7f18b12f48c0 -1 bluestore(td/osd-scrub-repair/7) _read_fsid unparsable uuid 2026-03-08T23:01:25.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/7/keyring 2026-03-08T23:01:25.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:25.703 INFO:tasks.workunit.client.0.vm03.stdout:adding osd7 key to auth repository 2026-03-08T23:01:25.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd7 key to auth repository 2026-03-08T23:01:25.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/7/keyring auth add osd.7 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:25.913 INFO:tasks.workunit.client.0.vm03.stdout:start osd.7 2026-03-08T23:01:25.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.7 2026-03-08T23:01:25.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 7 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/7 --osd-journal=td/osd-scrub-repair/7/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:25.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:25.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:25.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:25.933 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:25.934+0000 7fa85b6488c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:25.938 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:25.942+0000 7fa85b6488c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:25.940 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:25.942+0000 7fa85b6488c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:26.107 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:01:26.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 7 2026-03-08T23:01:26.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:26.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=7 2026-03-08T23:01:26.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:26.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:26.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:26.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:26.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:26.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:01:26.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:27.150 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:27.154+0000 7fa85b6488c0 -1 Falling back to public interface 2026-03-08T23:01:27.288 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:01:27.288 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:27.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:27.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:27.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:27.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:01:27.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:28.377 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:28.378+0000 7fa85b6488c0 -1 osd.7 0 log_to_monitors true 2026-03-08T23:01:28.460 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:01:28.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:28.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:28.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:28.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:28.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:01:28.659 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:29.399 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:29.402+0000 7fa856e01640 -1 osd.7 0 waiting for initial osdmap 2026-03-08T23:01:29.660 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:01:29.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:29.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:29.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:29.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:29.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stdout:osd.7 up in weight 1 up_from 40 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6858/795247047,v1:127.0.0.1:6859/795247047] [v2:127.0.0.1:6860/795247047,v1:127.0.0.1:6861/795247047] exists,up 3fceaf6a-e5c9-4b24-8a59-9233cf5a6090 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:29.848 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 8 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=8 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/8 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: 
ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/8' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/8/journal' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:29.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:01:29.848 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:29.849 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:29.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/8 2026-03-08T23:01:29.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:29.851 INFO:tasks.workunit.client.0.vm03.stdout:add osd8 cae3905d-ac26-4682-919f-92ceddb28aad 2026-03-08T23:01:29.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=cae3905d-ac26-4682-919f-92ceddb28aad 2026-03-08T23:01:29.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd8 cae3905d-ac26-4682-919f-92ceddb28aad' 2026-03-08T23:01:29.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:29.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDJ/61pd5bKMxAAUYTLrwRZIj3JLBYGESNcXA== 2026-03-08T23:01:29.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDJ/61pd5bKMxAAUYTLrwRZIj3JLBYGESNcXA=="}' 2026-03-08T23:01:29.863 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new cae3905d-ac26-4682-919f-92ceddb28aad -i td/osd-scrub-repair/8/new.json 2026-03-08T23:01:30.024 INFO:tasks.workunit.client.0.vm03.stdout:8 2026-03-08T23:01:30.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/8/new.json 2026-03-08T23:01:30.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 8 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/8 --osd-journal=td/osd-scrub-repair/8/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDJ/61pd5bKMxAAUYTLrwRZIj3JLBYGESNcXA== --osd-uuid cae3905d-ac26-4682-919f-92ceddb28aad 2026-03-08T23:01:30.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:30.070+0000 7fea2dedb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:30.071 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:30.074+0000 7fea2dedb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:30.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:30.074+0000 7fea2dedb8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:30.073 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:30.078+0000 7fea2dedb8c0 -1 bdev(0x56164d981c00 td/osd-scrub-repair/8/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:30.073 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:30.078+0000 7fea2dedb8c0 -1 bluestore(td/osd-scrub-repair/8) _read_fsid unparsable uuid 2026-03-08T23:01:33.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/8/keyring 2026-03-08T23:01:33.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:33.391 INFO:tasks.workunit.client.0.vm03.stdout:adding osd8 key to auth repository 2026-03-08T23:01:33.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd8 key to auth repository 2026-03-08T23:01:33.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/8/keyring auth add osd.8 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:33.603 INFO:tasks.workunit.client.0.vm03.stdout:start osd.8 2026-03-08T23:01:33.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.8 2026-03-08T23:01:33.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 8 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/8 --osd-journal=td/osd-scrub-repair/8/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:33.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:33.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:33.627 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:33.630+0000 7f2e84b2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:33.635 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:33.638+0000 7f2e84b2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:33.642 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:33.638+0000 7f2e84b2b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 8 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=8 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:33.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T23:01:34.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:34.842 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:34.846+0000 7f2e84b2b8c0 -1 Falling back to public interface 2026-03-08T23:01:35.096 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:01:35.096 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:35.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:35.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:35.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:35.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T23:01:35.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:35.820 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:35.822+0000 7f2e84b2b8c0 -1 osd.8 0 log_to_monitors true 2026-03-08T23:01:36.266 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:01:36.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:36.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:36.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:36.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:36.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T23:01:36.467 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:37.469 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:01:37.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:37.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:37.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:37.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:37.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T23:01:37.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:38.673 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:01:38.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:38.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:38.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:01:38.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:38.673 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up' 2026-03-08T23:01:38.855 INFO:tasks.workunit.client.0.vm03.stdout:osd.8 up in weight 1 up_from 45 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6866/2012422512,v1:127.0.0.1:6867/2012422512] [v2:127.0.0.1:6868/2012422512,v1:127.0.0.1:6869/2012422512] exists,up cae3905d-ac26-4682-919f-92ceddb28aad 2026-03-08T23:01:38.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:38.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 9 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=9 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:01:38.856 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:01:38.856 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:01:38.856 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:01:38.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:38.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:01:38.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:01:38.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:01:38.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:01:38.858 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:01:38.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:01:38.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:01:38.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:01:38.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:01:38.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:01:38.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:01:38.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:01:38.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/9 2026-03-08T23:01:38.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:01:38.860 INFO:tasks.workunit.client.0.vm03.stdout:add osd9 481e3c8d-7f04-4b16-a1ed-4401e8c9a642 2026-03-08T23:01:38.860 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=481e3c8d-7f04-4b16-a1ed-4401e8c9a642 2026-03-08T23:01:38.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd9 481e3c8d-7f04-4b16-a1ed-4401e8c9a642' 2026-03-08T23:01:38.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:01:38.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDS/61pBSO6NBAAhRF6KqGLaBYo9AhonjzGhg== 2026-03-08T23:01:38.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDS/61pBSO6NBAAhRF6KqGLaBYo9AhonjzGhg=="}' 2026-03-08T23:01:38.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 481e3c8d-7f04-4b16-a1ed-4401e8c9a642 -i td/osd-scrub-repair/9/new.json 2026-03-08T23:01:39.052 INFO:tasks.workunit.client.0.vm03.stdout:9 2026-03-08T23:01:39.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/9/new.json 2026-03-08T23:01:39.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDS/61pBSO6NBAAhRF6KqGLaBYo9AhonjzGhg== --osd-uuid 481e3c8d-7f04-4b16-a1ed-4401e8c9a642 2026-03-08T23:01:39.082 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:39.086+0000 7fd6ca9f48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:39.084 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:39.086+0000 7fd6ca9f48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:39.085 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:39.090+0000 7fd6ca9f48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:39.085 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:39.090+0000 7fd6ca9f48c0 -1 bdev(0x55f4801bbc00 td/osd-scrub-repair/9/block) open stat got: (1) Operation not permitted 2026-03-08T23:01:39.085 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:39.090+0000 7fd6ca9f48c0 -1 bluestore(td/osd-scrub-repair/9) _read_fsid unparsable uuid 2026-03-08T23:01:41.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/9/keyring 2026-03-08T23:01:41.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:01:41.355 INFO:tasks.workunit.client.0.vm03.stdout:adding osd9 key to auth repository 2026-03-08T23:01:41.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd9 key to auth repository 2026-03-08T23:01:41.355 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/9/keyring auth add osd.9 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:01:41.568 INFO:tasks.workunit.client.0.vm03.stdout:start osd.9 2026-03-08T23:01:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.9 2026-03-08T23:01:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:01:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:01:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:01:41.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:01:41.585 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:41.586+0000 7f04ce0ec8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:01:41.586 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:41.590+0000 7f04ce0ec8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:41.587 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:41.590+0000 7f04ce0ec8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:01:41.814 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 9 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:41.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:01:41.996 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:42.794 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:42.798+0000 7f04ce0ec8c0 -1 Falling back to public interface 2026-03-08T23:01:42.997 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:01:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:01:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:42.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:01:43.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:44.185 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:01:44.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:44.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:44.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:01:44.186 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:44.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:01:44.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:44.745 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:44.750+0000 7f04ce0ec8c0 -1 osd.9 0 log_to_monitors true 2026-03-08T23:01:45.359 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:01:45.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:45.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:45.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:01:45.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:45.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:01:45.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:46.803 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:01:46.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:46.804 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:46.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:01:46.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:46.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:01:47.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:01:47.086 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:01:47.090+0000 7f04c98a5640 -1 osd.9 0 waiting for initial osdmap 2026-03-08T23:01:48.003 INFO:tasks.workunit.client.0.vm03.stdout:5 2026-03-08T23:01:48.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:01:48.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:01:48.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 5 2026-03-08T23:01:48.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:01:48.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:01:48.211 INFO:tasks.workunit.client.0.vm03.stdout:osd.9 up in weight 1 up_from 50 up_thru 0 down_at 0 last_clean_interval [0,0) 
[v2:127.0.0.1:6874/1787820626,v1:127.0.0.1:6875/1787820626] [v2:127.0.0.1:6876/1787820626,v1:127.0.0.1:6877/1787820626] exists,up 481e3c8d-7f04-4b16-a1ed-4401e8c9a642 2026-03-08T23:01:48.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:01:48.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:01:48.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:01:48.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:834: corrupt_and_repair_lrc: create_rbd_pool 2026-03-08T23:01:48.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:01:48.386 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T23:01:48.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:01:48.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:01:48.620 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:01:48.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:01:49.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:01:49.931 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:835: corrupt_and_repair_lrc: wait_for_clean 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:01:49.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:01:50.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:01:50.011 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:01:50.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:01:50.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:01:50.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:01:50.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:01:50.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.185 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:01:50.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836496 2026-03-08T23:01:50.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836496 2026-03-08T23:01:50.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496' 2026-03-08T23:01:50.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:01:50.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672975 2026-03-08T23:01:50.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672975 2026-03-08T23:01:50.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975' 2026-03-08T23:01:50.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:01:50.422 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509453 2026-03-08T23:01:50.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509453 2026-03-08T23:01:50.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975 2-64424509453' 2026-03-08T23:01:50.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:01:50.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345932 2026-03-08T23:01:50.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345932 2026-03-08T23:01:50.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975 2-64424509453 3-85899345932' 2026-03-08T23:01:50.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:01:50.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182410 2026-03-08T23:01:50.579 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182410 2026-03-08T23:01:50.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975 2-64424509453 3-85899345932 4-107374182410' 2026-03-08T23:01:50.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:01:50.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018888 2026-03-08T23:01:50.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018888 2026-03-08T23:01:50.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975 2-64424509453 3-85899345932 4-107374182410 5-128849018888' 2026-03-08T23:01:50.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.667 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:01:50.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855367 2026-03-08T23:01:50.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: 
flush_pg_stats: test -z 150323855367 2026-03-08T23:01:50.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975 2-64424509453 3-85899345932 4-107374182410 5-128849018888 6-150323855367' 2026-03-08T23:01:50.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:01:50.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691846 2026-03-08T23:01:50.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691846 2026-03-08T23:01:50.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975 2-64424509453 3-85899345932 4-107374182410 5-128849018888 6-150323855367 7-171798691846' 2026-03-08T23:01:50.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:01:50.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528324 2026-03-08T23:01:50.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528324 
2026-03-08T23:01:50.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975 2-64424509453 3-85899345932 4-107374182410 5-128849018888 6-150323855367 7-171798691846 8-193273528324' 2026-03-08T23:01:50.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:50.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:01:50.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364802 2026-03-08T23:01:50.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364802 2026-03-08T23:01:50.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-42949672975 2-64424509453 3-85899345932 4-107374182410 5-128849018888 6-150323855367 7-171798691846 8-193273528324 9-214748364802' 2026-03-08T23:01:50.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:50.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836496 2026-03-08T23:01:50.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:50.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:01:50.988 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836496 2026-03-08T23:01:50.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:50.989 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836496 2026-03-08T23:01:50.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836496 2026-03-08T23:01:50.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836496' 2026-03-08T23:01:50.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:01:51.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836495 -lt 21474836496 2026-03-08T23:01:51.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:01:52.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:01:52.156 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:01:52.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836496 2026-03-08T23:01:52.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T23:01:52.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672975 2026-03-08T23:01:52.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:52.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:01:52.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672975 2026-03-08T23:01:52.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:52.329 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672975 2026-03-08T23:01:52.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672975 2026-03-08T23:01:52.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672975' 2026-03-08T23:01:52.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:01:52.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672975 -lt 42949672975 2026-03-08T23:01:52.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:52.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-64424509453 2026-03-08T23:01:52.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:52.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:01:52.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509453 2026-03-08T23:01:52.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:52.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509453 2026-03-08T23:01:52.519 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509453 2026-03-08T23:01:52.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509453' 2026-03-08T23:01:52.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:01:52.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509453 -lt 64424509453 2026-03-08T23:01:52.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:52.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345932 2026-03-08T23:01:52.706 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:52.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:01:52.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345932 2026-03-08T23:01:52.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:52.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345932 2026-03-08T23:01:52.709 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345932 2026-03-08T23:01:52.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345932' 2026-03-08T23:01:52.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:01:52.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345932 -lt 85899345932 2026-03-08T23:01:52.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:52.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182410 2026-03-08T23:01:52.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T23:01:52.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:01:52.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182410 2026-03-08T23:01:52.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:52.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182410 2026-03-08T23:01:52.901 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.4 seq 107374182410 2026-03-08T23:01:52.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182410' 2026-03-08T23:01:52.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:01:53.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182410 -lt 107374182410 2026-03-08T23:01:53.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:53.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-128849018888 2026-03-08T23:01:53.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:53.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
osd=5 2026-03-08T23:01:53.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-128849018888 2026-03-08T23:01:53.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:53.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018888 2026-03-08T23:01:53.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018888' 2026-03-08T23:01:53.104 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.5 seq 128849018888 2026-03-08T23:01:53.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:01:53.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018889 -lt 128849018888 2026-03-08T23:01:53.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:53.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855367 2026-03-08T23:01:53.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:53.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:01:53.278 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 6-150323855367 2026-03-08T23:01:53.278 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:53.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855367 2026-03-08T23:01:53.280 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.6 seq 150323855367 2026-03-08T23:01:53.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855367' 2026-03-08T23:01:53.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:01:53.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855367 -lt 150323855367 2026-03-08T23:01:53.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:53.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691846 2026-03-08T23:01:53.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:53.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:01:53.460 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691846 2026-03-08T23:01:53.460 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:53.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691846 2026-03-08T23:01:53.461 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.7 seq 171798691846 2026-03-08T23:01:53.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691846' 2026-03-08T23:01:53.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:01:53.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691846 -lt 171798691846 2026-03-08T23:01:53.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:53.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528324 2026-03-08T23:01:53.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:53.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:01:53.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528324 2026-03-08T23:01:53.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:01:53.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528324 2026-03-08T23:01:53.650 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.8 seq 193273528324 2026-03-08T23:01:53.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528324' 2026-03-08T23:01:53.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:01:53.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528325 -lt 193273528324 2026-03-08T23:01:53.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:53.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-214748364802 2026-03-08T23:01:53.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:53.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:01:53.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-214748364802 2026-03-08T23:01:53.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:53.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=214748364802 2026-03-08T23:01:53.840 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.9 seq 214748364802 2026-03-08T23:01:53.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 214748364802' 2026-03-08T23:01:53.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:01:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364803 -lt 214748364802 2026-03-08T23:01:54.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:01:54.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:01:54.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:01:54.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:01:54.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:01:54.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:01:54.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:01:54.230 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:01:54.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:01:54.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:01:54.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:01:54.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:01:54.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:01:54.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:01:54.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:01:54.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:01:54.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:01:54.618 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:01:54.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:837: corrupt_and_repair_lrc: create_ec_pool ecpool false k=4 m=2 l=3 plugin=lrc 2026-03-08T23:01:54.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T23:01:54.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T23:01:54.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=false 2026-03-08T23:01:54.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T23:01:54.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=4 m=2 l=3 plugin=lrc 2026-03-08T23:01:54.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T23:01:54.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T23:01:55.154 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T23:01:55.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:01:56.176 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' false = true ']' 2026-03-08T23:01:56.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T23:01:56.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:01:56.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:01:56.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:01:56.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:01:56.177 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:01:56.177 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:01:56.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:01:56.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:01:56.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:01:56.242 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:01:56.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:01:56.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:01:56.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:01:56.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:01:56.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 
2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:56.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:01:56.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836498 2026-03-08T23:01:56.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836498 2026-03-08T23:01:56.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498' 2026-03-08T23:01:56.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:56.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:01:56.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672977 2026-03-08T23:01:56.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672977 2026-03-08T23:01:56.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977' 2026-03-08T23:01:56.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:56.597 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:01:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509455 2026-03-08T23:01:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509455 2026-03-08T23:01:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977 2-64424509455' 2026-03-08T23:01:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:56.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:01:56.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345934 2026-03-08T23:01:56.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345934 2026-03-08T23:01:56.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977 2-64424509455 3-85899345934' 2026-03-08T23:01:56.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:56.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 
2026-03-08T23:01:56.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182412 2026-03-08T23:01:56.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182412 2026-03-08T23:01:56.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977 2-64424509455 3-85899345934 4-107374182412' 2026-03-08T23:01:56.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:56.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:01:56.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018891 2026-03-08T23:01:56.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018891 2026-03-08T23:01:56.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977 2-64424509455 3-85899345934 4-107374182412 5-128849018891' 2026-03-08T23:01:56.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:56.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:01:57.014 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855369 2026-03-08T23:01:57.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855369 2026-03-08T23:01:57.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977 2-64424509455 3-85899345934 4-107374182412 5-128849018891 6-150323855369' 2026-03-08T23:01:57.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:57.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:01:57.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691848 2026-03-08T23:01:57.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691848 2026-03-08T23:01:57.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977 2-64424509455 3-85899345934 4-107374182412 5-128849018891 6-150323855369 7-171798691848' 2026-03-08T23:01:57.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:57.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:01:57.188 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528326 2026-03-08T23:01:57.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528326 2026-03-08T23:01:57.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977 2-64424509455 3-85899345934 4-107374182412 5-128849018891 6-150323855369 7-171798691848 8-193273528326' 2026-03-08T23:01:57.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:01:57.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:01:57.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364804 2026-03-08T23:01:57.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364804 2026-03-08T23:01:57.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672977 2-64424509455 3-85899345934 4-107374182412 5-128849018891 6-150323855369 7-171798691848 8-193273528326 9-214748364804' 2026-03-08T23:01:57.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:57.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836498 2026-03-08T23:01:57.287 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:57.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:01:57.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836498 2026-03-08T23:01:57.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:57.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836498 2026-03-08T23:01:57.290 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836498 2026-03-08T23:01:57.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836498' 2026-03-08T23:01:57.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:01:57.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836497 -lt 21474836498 2026-03-08T23:01:57.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:01:58.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:01:58.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:01:58.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836499 -lt 21474836498 2026-03-08T23:01:58.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:58.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672977 2026-03-08T23:01:58.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:58.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:01:58.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672977 2026-03-08T23:01:58.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:58.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672977 2026-03-08T23:01:58.656 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672977 2026-03-08T23:01:58.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672977' 2026-03-08T23:01:58.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:01:58.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
42949672977 -lt 42949672977 2026-03-08T23:01:58.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:58.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509455 2026-03-08T23:01:58.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:58.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:01:58.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509455 2026-03-08T23:01:58.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509455 2026-03-08T23:01:58.839 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509455 2026-03-08T23:01:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509455' 2026-03-08T23:01:58.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:01:59.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509456 -lt 64424509455 2026-03-08T23:01:59.025 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:59.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345934 2026-03-08T23:01:59.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:59.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:01:59.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345934 2026-03-08T23:01:59.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:59.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345934 2026-03-08T23:01:59.028 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345934 2026-03-08T23:01:59.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345934' 2026-03-08T23:01:59.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:01:59.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345934 -lt 85899345934 2026-03-08T23:01:59.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:01:59.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182412 2026-03-08T23:01:59.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:59.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:01:59.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182412 2026-03-08T23:01:59.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:59.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182412 2026-03-08T23:01:59.214 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.4 seq 107374182412 2026-03-08T23:01:59.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182412' 2026-03-08T23:01:59.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:01:59.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182413 -lt 107374182412 2026-03-08T23:01:59.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:59.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
echo 5-128849018891 2026-03-08T23:01:59.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:59.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:01:59.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-128849018891 2026-03-08T23:01:59.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:59.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018891 2026-03-08T23:01:59.388 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.5 seq 128849018891 2026-03-08T23:01:59.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018891' 2026-03-08T23:01:59.389 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:01:59.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018891 -lt 128849018891 2026-03-08T23:01:59.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:59.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855369 2026-03-08T23:01:59.565 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:59.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:01:59.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855369 2026-03-08T23:01:59.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:59.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855369 2026-03-08T23:01:59.568 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.6 seq 150323855369 2026-03-08T23:01:59.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855369' 2026-03-08T23:01:59.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:01:59.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855370 -lt 150323855369 2026-03-08T23:01:59.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:59.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691848 2026-03-08T23:01:59.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T23:01:59.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:01:59.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691848 2026-03-08T23:01:59.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:59.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691848 2026-03-08T23:01:59.741 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.7 seq 171798691848 2026-03-08T23:01:59.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691848' 2026-03-08T23:01:59.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:01:59.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691849 -lt 171798691848 2026-03-08T23:01:59.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:01:59.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528326 2026-03-08T23:01:59.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:01:59.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
osd=8 2026-03-08T23:01:59.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528326 2026-03-08T23:01:59.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:01:59.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528326 2026-03-08T23:01:59.921 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.8 seq 193273528326 2026-03-08T23:01:59.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528326' 2026-03-08T23:01:59.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:02:00.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528327 -lt 193273528326 2026-03-08T23:02:00.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:00.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-214748364804 2026-03-08T23:02:00.107 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:00.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:02:00.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 9-214748364804 2026-03-08T23:02:00.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:00.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=214748364804 2026-03-08T23:02:00.109 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.9 seq 214748364804 2026-03-08T23:02:00.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 214748364804' 2026-03-08T23:02:00.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:02:00.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364805 -lt 214748364804 2026-03-08T23:02:00.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:02:00.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:02:00.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:02:00.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:02:00.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:02:00.533 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:02:00.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:02:00.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:02:00.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:02:00.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:02:00.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:02:00.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:02:00.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:02:00.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:02:00.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:02:00.934 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:838: corrupt_and_repair_lrc: corrupt_and_repair_erasure_coded td/osd-scrub-repair ecpool 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:248: corrupt_and_repair_erasure_coded: local dir=td/osd-scrub-repair 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:249: corrupt_and_repair_erasure_coded: local poolname=ecpool 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:251: corrupt_and_repair_erasure_coded: add_something td/osd-scrub-repair ecpool 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:02:00.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:02:00.935 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING 2026-03-08T23:02:00.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:02:00.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:02:00.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:02:01.164 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:02:01.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:02:01.381 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:02:01.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:02:01.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:02:01.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T23:02:01.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:253: corrupt_and_repair_erasure_coded: get_primary ecpool SOMETHING 2026-03-08T23:02:01.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: 
get_primary: local poolname=ecpool 2026-03-08T23:02:01.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T23:02:01.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:02:01.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:02:01.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:253: corrupt_and_repair_erasure_coded: local primary=3 2026-03-08T23:02:01.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: get_osds ecpool SOMETHING 2026-03-08T23:02:01.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T23:02:01.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T23:02:01.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: sed -e s/3// 2026-03-08T23:02:01.625 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:02:01.625 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:02:01.807 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:9 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:7' 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 9 0 6 2 1 7 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: osds=('5' '9' '0' '6' '2' '1' '7') 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: local -a osds 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:255: corrupt_and_repair_erasure_coded: local not_primary_first=5 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:256: corrupt_and_repair_erasure_coded: local not_primary_second=9 2026-03-08T23:02:01.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:259: corrupt_and_repair_erasure_coded: corrupt_and_repair_one td/osd-scrub-repair ecpool 3 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: 
corrupt_and_repair_one: local dir=td/osd-scrub-repair 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=ecpool 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=3 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:02:01.808 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:02:01.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:02:01.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:02:01.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:02:01.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:02:01.914 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:02:01.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:02:01.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:02:01.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:02:01.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T23:02:02.577 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#2:eb822e21:::SOMETHING:head# 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:02:03.109 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:02:03.109 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:02:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:02:03.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:02:03.110 INFO:tasks.workunit.client.0.vm03.stderr:start osd.3 2026-03-08T23:02:03.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:02:03.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:02:03.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:02:03.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:02:03.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:02:03.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:02:03.126 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:03.130+0000 7f19381c08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:03.127 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:03.130+0000 7f19381c08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:03.129 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:03.130+0000 7f19381c08c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:02:03.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:03.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:04.090 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:04.094+0000 7f19381c08c0 -1 Falling back to public interface 2026-03-08T23:02:04.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:02:04.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:04.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:02:04.483 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:02:04.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:04.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:02:04.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:05.065 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:05.070+0000 7f19381c08c0 -1 osd.3 66 log_to_monitors true 2026-03-08T23:02:05.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:05.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:05.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:02:05.671 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:02:05.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:05.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:02:05.857 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:06.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:06.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:06.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:02:06.859 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:02:06.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:06.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:02:07.038 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 up in weight 1 up_from 70 up_thru 70 down_at 67 last_clean_interval [20,66) [v2:127.0.0.1:6826/2755087669,v1:127.0.0.1:6827/2755087669] [v2:127.0.0.1:6828/2755087669,v1:127.0.0.1:6829/2755087669] exists,up df112e4e-0f41-42b6-ad94-9c02d4399d65 2026-03-08T23:02:07.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:02:07.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:02:07.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:02:07.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T23:02:07.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:02:07.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:02:07.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:02:07.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:02:07.039 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:02:07.039 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:02:07.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:02:07.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:02:07.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:02:07.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:02:07.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:02:07.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:02:07.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:02:07.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:02:07.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:02:07.366 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836501 2026-03-08T23:02:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836501 2026-03-08T23:02:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501' 2026-03-08T23:02:07.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:02:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672980 2026-03-08T23:02:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672980 2026-03-08T23:02:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980' 2026-03-08T23:02:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:02:07.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509459 2026-03-08T23:02:07.564 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509459 2026-03-08T23:02:07.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509459' 2026-03-08T23:02:07.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.564 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:02:07.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710722 2026-03-08T23:02:07.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710722 2026-03-08T23:02:07.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509459 3-300647710722' 2026-03-08T23:02:07.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:02:07.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182416 2026-03-08T23:02:07.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182416 
2026-03-08T23:02:07.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509459 3-300647710722 4-107374182416' 2026-03-08T23:02:07.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:02:07.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018894 2026-03-08T23:02:07.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018894 2026-03-08T23:02:07.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509459 3-300647710722 4-107374182416 5-128849018894' 2026-03-08T23:02:07.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:02:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855372 2026-03-08T23:02:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855372 2026-03-08T23:02:07.899 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509459 3-300647710722 4-107374182416 5-128849018894 6-150323855372' 2026-03-08T23:02:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:02:07.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691851 2026-03-08T23:02:07.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691851 2026-03-08T23:02:07.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509459 3-300647710722 4-107374182416 5-128849018894 6-150323855372 7-171798691851' 2026-03-08T23:02:07.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:07.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:02:08.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528330 2026-03-08T23:02:08.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528330 2026-03-08T23:02:08.075 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509459 3-300647710722 4-107374182416 5-128849018894 6-150323855372 7-171798691851 8-193273528330' 2026-03-08T23:02:08.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:08.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:02:08.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=214748364808 2026-03-08T23:02:08.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 214748364808 2026-03-08T23:02:08.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509459 3-300647710722 4-107374182416 5-128849018894 6-150323855372 7-171798691851 8-193273528330 9-214748364808' 2026-03-08T23:02:08.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:08.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836501 2026-03-08T23:02:08.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:08.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:02:08.174 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836501 2026-03-08T23:02:08.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:08.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836501 2026-03-08T23:02:08.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836501' 2026-03-08T23:02:08.175 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836501 2026-03-08T23:02:08.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:02:08.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836502 -lt 21474836501 2026-03-08T23:02:08.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:08.363 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672980 2026-03-08T23:02:08.363 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:08.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:02:08.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672980 
2026-03-08T23:02:08.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:08.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672980 2026-03-08T23:02:08.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672980' 2026-03-08T23:02:08.364 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672980 2026-03-08T23:02:08.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:02:08.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672980 -lt 42949672980 2026-03-08T23:02:08.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:08.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509459 2026-03-08T23:02:08.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:08.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:02:08.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509459 2026-03-08T23:02:08.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d 
- -f 2 2026-03-08T23:02:08.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509459 2026-03-08T23:02:08.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509459' 2026-03-08T23:02:08.599 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509459 2026-03-08T23:02:08.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:02:08.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509459 -lt 64424509459 2026-03-08T23:02:08.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:08.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-300647710722 2026-03-08T23:02:08.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:08.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:02:08.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-300647710722 2026-03-08T23:02:08.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:08.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=300647710722 2026-03-08T23:02:08.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 300647710722' 2026-03-08T23:02:08.810 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 300647710722 2026-03-08T23:02:08.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:02:09.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710722 -lt 300647710722 2026-03-08T23:02:09.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:09.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182416 2026-03-08T23:02:09.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:09.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:02:09.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182416 2026-03-08T23:02:09.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:09.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182416 2026-03-08T23:02:09.005 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182416' 2026-03-08T23:02:09.005 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.4 seq 107374182416 2026-03-08T23:02:09.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:02:09.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182415 -lt 107374182416 2026-03-08T23:02:09.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:02:10.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:02:10.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:02:10.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182416 -lt 107374182416 2026-03-08T23:02:10.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:10.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-128849018894 2026-03-08T23:02:10.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:10.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=5 2026-03-08T23:02:10.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-128849018894 2026-03-08T23:02:10.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:10.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018894 2026-03-08T23:02:10.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018894' 2026-03-08T23:02:10.374 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.5 seq 128849018894 2026-03-08T23:02:10.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:02:10.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018894 -lt 128849018894 2026-03-08T23:02:10.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:10.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855372 2026-03-08T23:02:10.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:10.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:02:10.553 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855372 2026-03-08T23:02:10.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:10.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855372 2026-03-08T23:02:10.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855372' 2026-03-08T23:02:10.554 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.6 seq 150323855372 2026-03-08T23:02:10.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:02:10.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855373 -lt 150323855372 2026-03-08T23:02:10.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:10.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691851 2026-03-08T23:02:10.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:10.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:02:10.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691851 
2026-03-08T23:02:10.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:10.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691851 2026-03-08T23:02:10.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691851' 2026-03-08T23:02:10.725 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.7 seq 171798691851 2026-03-08T23:02:10.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:02:10.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691852 -lt 171798691851 2026-03-08T23:02:10.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:10.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528330 2026-03-08T23:02:10.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:10.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:02:10.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528330 2026-03-08T23:02:10.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
cut -d - -f 2 2026-03-08T23:02:10.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528330 2026-03-08T23:02:10.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528330' 2026-03-08T23:02:10.905 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.8 seq 193273528330 2026-03-08T23:02:10.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:02:11.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528330 -lt 193273528330 2026-03-08T23:02:11.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:11.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-214748364808 2026-03-08T23:02:11.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:11.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:02:11.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-214748364808 2026-03-08T23:02:11.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:11.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=214748364808 2026-03-08T23:02:11.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 214748364808' 2026-03-08T23:02:11.075 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.9 seq 214748364808 2026-03-08T23:02:11.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:02:11.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 214748364808 -lt 214748364808 2026-03-08T23:02:11.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:02:11.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:02:11.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:02:11.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:02:11.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:02:11.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:02:11.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:02:11.471 
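The `flush_pg_stats` section above follows a two-phase protocol: first `ceph tell osd.N flush_pg_stats` is sent to every OSD and each returned sequence number is recorded in `seqs`, then the helper loops over the pairs and waits until `ceph osd last-stat-seq N` reports at least that sequence (note osd.4 above needed one `sleep 1` retry). A minimal sketch of that protocol, with the two Ceph commands replaced by injected callables (`tell`, `last_stat_seq` are stand-in names, not a real client API):

```python
def flush_pg_stats(tell, last_stat_seq, osd_ids, sleep=lambda s: None):
    """Phase 1: ask every OSD to flush its PG stats, recording the
    flush sequence number each returns. Phase 2: poll the monitor
    until it has seen each OSD's stats at or past that sequence."""
    seqs = {osd: tell(osd, "flush_pg_stats") for osd in osd_ids}
    for osd, seq in seqs.items():
        while last_stat_seq(osd) < seq:
            sleep(1)  # monitor hasn't ingested this OSD's flush yet
```

Flushing all OSDs before waiting on any of them lets the flushes proceed concurrently instead of serializing a full flush-and-wait round trip per OSD.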
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:02:11.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:02:11.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:02:11.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:02:11.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:02:11.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:02:11.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:02:11.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:02:11.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:02:11.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:02:11.866 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:02:11.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg ecpool SOMETHING 2026-03-08T23:02:11.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:02:11.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:02:11.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:02:11.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:02:12.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=2.0 2026-03-08T23:02:12.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 2.0 2026-03-08T23:02:12.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:02:12.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:02:12.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:02:12.063 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:02:12.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:02:12.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:02:12.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:01:55.156210+0000 2026-03-08T23:02:12.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:02:12.410 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T23:02:12.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:01:55.156210+0000 2026-03-08T23:02:12.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:02:12.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:01:55.156210+0000 2026-03-08T23:02:12.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:02:12.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:02:12.425 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:02:12.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:02:12.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:02:12.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:02:12.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:02:12.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:02:12.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:01:55.156210+0000 '>' 2026-03-08T23:01:55.156210+0000 2026-03-08T23:02:12.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:02:13.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:02:13.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:02:13.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:02:13.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:02:13.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:02:13.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:02:13.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:02:13.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:01:55.156210+0000 '>' 2026-03-08T23:01:55.156210+0000 2026-03-08T23:02:13.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:02:14.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:02:14.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:02:14.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:02:14.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:02:14.793 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:02:14.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:02:14.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:02:14.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:01:55.156210+0000 '>' 2026-03-08T23:01:55.156210+0000 2026-03-08T23:02:14.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:02:15.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:02:15.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:02:15.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:02:15.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:02:15.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:02:15.968 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:02:15.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:12.987356+0000 '>' 2026-03-08T23:01:55.156210+0000 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:02:16.162 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=9 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.9 2026-03-08T23:02:16.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:02:16.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:02:16.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:02:16.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:02:16.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:02:16.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:02:16.268 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:02:16.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:02:16.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:02:16.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9 2026-03-08T23:02:16.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:02:16.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:02:16.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/9 SOMETHING list-attrs 2026-03-08T23:02:16.605 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T23:02:16.605 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key 2026-03-08T23:02:16.605 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 9 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:02:16.897 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:02:16.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9' 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal' 2026-03-08T23:02:16.898 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:02:16.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:16.899 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:02:16.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/9 2026-03-08T23:02:16.900 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9 2026-03-08T23:02:16.900 INFO:tasks.workunit.client.0.vm03.stderr:start osd.9 2026-03-08T23:02:16.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:02:16.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/9/whoami 2026-03-08T23:02:16.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']' 2026-03-08T23:02:16.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:02:16.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:02:16.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:02:16.918 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:16.918+0000 
7f99e6cb48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:16.918 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:16.922+0000 7f99e6cb48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:16.920 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:16.922+0000 7f99e6cb48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:17.338 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:17.878 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:17.882+0000 7f99e6cb48c0 -1 Falling back to public interface 2026-03-08T23:02:18.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:18.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:18.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:02:18.340 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:02:18.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:18.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:18.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:18.878 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:18.882+0000 7f99e6cb48c0 -1 osd.9 71 log_to_monitors true 2026-03-08T23:02:19.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:19.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:19.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:02:19.536 
INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:02:19.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:19.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:19.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:20.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:20.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:20.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:02:20.741 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:02:20.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:20.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:20.922 INFO:tasks.workunit.client.0.vm03.stderr:osd.9 up in weight 1 up_from 75 up_thru 0 down_at 72 last_clean_interval [50,71) [v2:127.0.0.1:6874/2491738046,v1:127.0.0.1:6875/2491738046] [v2:127.0.0.1:6876/2491738046,v1:127.0.0.1:6877/2491738046] exists,up 481e3c8d-7f04-4b16-a1ed-4401e8c9a642 2026-03-08T23:02:20.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:02:20.922 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:02:20.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:02:20.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:02:20.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:02:20.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:02:20.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:02:20.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:02:20.923 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:02:20.923 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:02:20.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:02:20.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:02:20.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:02:20.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:02:20.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:02:20.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:02:20.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:02:20.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:02:20.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:02:21.184 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:02:21.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836505 2026-03-08T23:02:21.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836505 2026-03-08T23:02:21.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505' 2026-03-08T23:02:21.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:02:21.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672984 2026-03-08T23:02:21.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672984 2026-03-08T23:02:21.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984' 2026-03-08T23:02:21.365 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.365 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:02:21.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509462 2026-03-08T23:02:21.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509462 2026-03-08T23:02:21.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462' 2026-03-08T23:02:21.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:02:21.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710726 2026-03-08T23:02:21.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710726 2026-03-08T23:02:21.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-300647710726' 2026-03-08T23:02:21.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.541 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:02:21.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182419 2026-03-08T23:02:21.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182419 2026-03-08T23:02:21.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-300647710726 4-107374182419' 2026-03-08T23:02:21.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:02:21.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018898 2026-03-08T23:02:21.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018898 2026-03-08T23:02:21.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-300647710726 4-107374182419 5-128849018898' 2026-03-08T23:02:21.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.718 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:02:21.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855376 2026-03-08T23:02:21.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855376 2026-03-08T23:02:21.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-300647710726 4-107374182419 5-128849018898 6-150323855376' 2026-03-08T23:02:21.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:02:21.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691855 2026-03-08T23:02:21.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691855 2026-03-08T23:02:21.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-300647710726 4-107374182419 5-128849018898 6-150323855376 7-171798691855' 2026-03-08T23:02:21.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.900 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:02:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528333 2026-03-08T23:02:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528333 2026-03-08T23:02:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-300647710726 4-107374182419 5-128849018898 6-150323855376 7-171798691855 8-193273528333' 2026-03-08T23:02:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:21.996 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:02:22.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=322122547202 2026-03-08T23:02:22.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 322122547202 2026-03-08T23:02:22.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672984 2-64424509462 3-300647710726 4-107374182419 5-128849018898 6-150323855376 7-171798691855 8-193273528333 9-322122547202' 2026-03-08T23:02:22.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:02:22.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836505 2026-03-08T23:02:22.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:22.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:02:22.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836505 2026-03-08T23:02:22.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:22.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836505 2026-03-08T23:02:22.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836505' 2026-03-08T23:02:22.089 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836505 2026-03-08T23:02:22.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:02:22.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836505 2026-03-08T23:02:22.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:22.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949672984 2026-03-08T23:02:22.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:22.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:02:22.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672984 2026-03-08T23:02:22.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:22.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672984 2026-03-08T23:02:22.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672984' 2026-03-08T23:02:22.283 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672984 2026-03-08T23:02:22.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:02:22.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672984 -lt 42949672984 2026-03-08T23:02:22.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:22.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509462 2026-03-08T23:02:22.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:22.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:02:22.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509462 2026-03-08T23:02:22.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:22.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509462 2026-03-08T23:02:22.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509462' 2026-03-08T23:02:22.472 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509462 2026-03-08T23:02:22.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:02:22.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509462 -lt 64424509462 2026-03-08T23:02:22.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:22.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-300647710726 2026-03-08T23:02:22.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:22.663 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:02:22.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-300647710726 2026-03-08T23:02:22.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:22.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710726 2026-03-08T23:02:22.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 300647710726' 2026-03-08T23:02:22.665 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 300647710726 2026-03-08T23:02:22.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:02:22.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710726 -lt 300647710726 2026-03-08T23:02:22.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:22.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182419 2026-03-08T23:02:22.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:22.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 
2026-03-08T23:02:22.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182419 2026-03-08T23:02:22.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:22.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182419 2026-03-08T23:02:22.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182419' 2026-03-08T23:02:22.864 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.4 seq 107374182419 2026-03-08T23:02:22.864 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:02:23.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182419 -lt 107374182419 2026-03-08T23:02:23.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:23.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-128849018898 2026-03-08T23:02:23.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:23.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:02:23.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 5-128849018898 2026-03-08T23:02:23.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:23.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018898 2026-03-08T23:02:23.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 128849018898' 2026-03-08T23:02:23.072 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.5 seq 128849018898 2026-03-08T23:02:23.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:02:23.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018897 -lt 128849018898 2026-03-08T23:02:23.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:02:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:02:24.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:02:24.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018898 -lt 128849018898 2026-03-08T23:02:24.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:24.449 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855376 2026-03-08T23:02:24.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:24.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:02:24.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855376 2026-03-08T23:02:24.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:24.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855376 2026-03-08T23:02:24.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855376' 2026-03-08T23:02:24.453 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.6 seq 150323855376 2026-03-08T23:02:24.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:02:24.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855376 -lt 150323855376 2026-03-08T23:02:24.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:24.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691855 
2026-03-08T23:02:24.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:24.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:02:24.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691855 2026-03-08T23:02:24.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:24.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691855 2026-03-08T23:02:24.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691855' 2026-03-08T23:02:24.643 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.7 seq 171798691855 2026-03-08T23:02:24.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:02:24.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691855 -lt 171798691855 2026-03-08T23:02:24.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:24.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528333 2026-03-08T23:02:24.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:02:24.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:02:24.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528333 2026-03-08T23:02:24.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:02:24.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528333 2026-03-08T23:02:24.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528333' 2026-03-08T23:02:24.839 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.8 seq 193273528333 2026-03-08T23:02:24.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:02:25.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528334 -lt 193273528333 2026-03-08T23:02:25.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:02:25.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-322122547202 2026-03-08T23:02:25.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:02:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=9
2026-03-08T23:02:25.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-322122547202
2026-03-08T23:02:25.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:25.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=322122547202
2026-03-08T23:02:25.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 322122547202'
2026-03-08T23:02:25.035 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.9 seq 322122547202
2026-03-08T23:02:25.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9
2026-03-08T23:02:25.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 322122547202 -lt 322122547202
2026-03-08T23:02:25.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:02:25.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:02:25.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:02:25.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:02:25.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:02:25.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:02:25.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:02:25.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:02:25.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:02:25.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:02:25.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:02:25.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T23:02:25.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:02:25.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:02:25.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:02:25.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T23:02:25.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:02:25.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:02:25.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY
2026-03-08T23:02:25.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:261: corrupt_and_repair_erasure_coded: corrupt_and_repair_one td/osd-scrub-repair ecpool 5
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=ecpool
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=5
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:02:25.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:02:25.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:02:25.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:02:25.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:02:26.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:02:26.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:02:26.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:02:26.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:02:26.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5
2026-03-08T23:02:26.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:02:26.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5
2026-03-08T23:02:26.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING remove
2026-03-08T23:02:26.840 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:eb822e21:::SOMETHING:head#
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5'
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal'
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:02:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:02:27.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:02:27.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:02:27.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:02:27.375 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:02:27.375 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:02:27.375 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:02:27.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:02:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5
2026-03-08T23:02:27.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5
2026-03-08T23:02:27.377 INFO:tasks.workunit.client.0.vm03.stderr:start osd.5
2026-03-08T23:02:27.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:02:27.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami
2026-03-08T23:02:27.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']'
2026-03-08T23:02:27.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:02:27.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:02:27.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:02:27.401 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:27.402+0000 7f88ccea18c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:02:27.402 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:27.406+0000 7f88ccea18c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:02:27.408 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:27.410+0000 7f88ccea18c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:02:27.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5
2026-03-08T23:02:27.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:02:27.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5
2026-03-08T23:02:27.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:02:27.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:02:27.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:02:27.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:02:27.572 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:02:27.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:02:27.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up'
2026-03-08T23:02:27.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:02:27.866 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:27.870+0000 7f88ccea18c0 -1 Falling back to public interface
2026-03-08T23:02:28.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:02:28.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:02:28.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:02:28.753 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:02:28.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:02:28.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up'
2026-03-08T23:02:28.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:28.850+0000 7f88ccea18c0 -1 osd.5 76 log_to_monitors true
2026-03-08T23:02:28.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:02:29.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:02:29.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:02:29.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:02:29.948 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:02:29.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:02:29.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up'
2026-03-08T23:02:30.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:02:31.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:02:31.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:02:31.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:02:31.135 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:02:31.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:02:31.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up'
2026-03-08T23:02:31.321 INFO:tasks.workunit.client.0.vm03.stderr:osd.5 up in weight 1 up_from 80 up_thru 80 down_at 77 last_clean_interval [30,76) [v2:127.0.0.1:6842/1714259639,v1:127.0.0.1:6843/1714259639] [v2:127.0.0.1:6844/1714259639,v1:127.0.0.1:6845/1714259639] exists,up 0c5fe5be-089f-445b-a798-38a5fe8df624
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:02:31.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:02:31.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:02:31.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:02:31.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:02:31.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:02:31.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:02:31.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:4
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:5
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:6
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:7
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:8
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:9'
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:31.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:02:31.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836508
2026-03-08T23:02:31.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836508
2026-03-08T23:02:31.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508'
2026-03-08T23:02:31.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:31.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:02:31.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672987
2026-03-08T23:02:31.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672987
2026-03-08T23:02:31.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987'
2026-03-08T23:02:31.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:31.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:02:31.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509465
2026-03-08T23:02:31.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509465
2026-03-08T23:02:31.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987 2-64424509465'
2026-03-08T23:02:31.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:31.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:02:32.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710729
2026-03-08T23:02:32.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710729
2026-03-08T23:02:32.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987 2-64424509465 3-300647710729'
2026-03-08T23:02:32.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:32.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T23:02:32.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182423
2026-03-08T23:02:32.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182423
2026-03-08T23:02:32.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987 2-64424509465 3-300647710729 4-107374182423'
2026-03-08T23:02:32.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:32.131 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T23:02:32.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=343597383682
2026-03-08T23:02:32.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 343597383682
2026-03-08T23:02:32.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987 2-64424509465 3-300647710729 4-107374182423 5-343597383682'
2026-03-08T23:02:32.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:32.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T23:02:32.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855379
2026-03-08T23:02:32.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855379
2026-03-08T23:02:32.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987 2-64424509465 3-300647710729 4-107374182423 5-343597383682 6-150323855379'
2026-03-08T23:02:32.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:32.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T23:02:32.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691858
2026-03-08T23:02:32.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691858
2026-03-08T23:02:32.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987 2-64424509465 3-300647710729 4-107374182423 5-343597383682 6-150323855379 7-171798691858'
2026-03-08T23:02:32.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:32.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T23:02:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528336
2026-03-08T23:02:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528336
2026-03-08T23:02:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987 2-64424509465 3-300647710729 4-107374182423 5-343597383682 6-150323855379 7-171798691858 8-193273528336'
2026-03-08T23:02:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:32.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T23:02:32.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=322122547205
2026-03-08T23:02:32.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 322122547205
2026-03-08T23:02:32.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672987 2-64424509465 3-300647710729 4-107374182423 5-343597383682 6-150323855379 7-171798691858 8-193273528336 9-322122547205'
2026-03-08T23:02:32.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:32.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836508
2026-03-08T23:02:32.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:32.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:02:32.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836508
2026-03-08T23:02:32.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:32.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836508
2026-03-08T23:02:32.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836508'
2026-03-08T23:02:32.580 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836508
2026-03-08T23:02:32.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:02:32.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836508 -lt 21474836508
2026-03-08T23:02:32.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:32.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:32.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672987
2026-03-08T23:02:32.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:02:32.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672987
2026-03-08T23:02:32.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:32.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672987
2026-03-08T23:02:32.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672987'
2026-03-08T23:02:32.766 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672987
2026-03-08T23:02:32.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:02:32.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672986 -lt 42949672987
2026-03-08T23:02:32.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:02:33.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:02:33.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:02:34.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672988 -lt 42949672987
2026-03-08T23:02:34.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:34.120 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509465
2026-03-08T23:02:34.120 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:34.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:02:34.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509465
2026-03-08T23:02:34.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:34.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509465
2026-03-08T23:02:34.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509465'
2026-03-08T23:02:34.124 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509465
2026-03-08T23:02:34.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:02:34.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509466 -lt 64424509465
2026-03-08T23:02:34.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:34.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-300647710729
2026-03-08T23:02:34.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:34.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:02:34.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-300647710729
2026-03-08T23:02:34.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:34.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710729
2026-03-08T23:02:34.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 300647710729'
2026-03-08T23:02:34.305 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 300647710729
2026-03-08T23:02:34.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:02:34.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710729 -lt 300647710729
2026-03-08T23:02:34.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:34.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182423
2026-03-08T23:02:34.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:34.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T23:02:34.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182423
2026-03-08T23:02:34.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:34.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182423
2026-03-08T23:02:34.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182423'
2026-03-08T23:02:34.488 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.4 seq 107374182423
2026-03-08T23:02:34.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T23:02:34.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182423 -lt 107374182423
2026-03-08T23:02:34.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:34.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-343597383682
2026-03-08T23:02:34.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:34.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T23:02:34.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-343597383682
2026-03-08T23:02:34.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:34.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=343597383682
2026-03-08T23:02:34.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 343597383682'
2026-03-08T23:02:34.673 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.5 seq 343597383682
2026-03-08T23:02:34.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T23:02:34.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 343597383682 -lt 343597383682
2026-03-08T23:02:34.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:34.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855379
2026-03-08T23:02:34.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:34.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T23:02:34.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855379
2026-03-08T23:02:34.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:34.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855379
2026-03-08T23:02:34.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855379'
2026-03-08T23:02:34.861 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.6 seq 150323855379
2026-03-08T23:02:34.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T23:02:35.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855379 -lt 150323855379
2026-03-08T23:02:35.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:35.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691858
2026-03-08T23:02:35.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:35.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7
2026-03-08T23:02:35.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691858
2026-03-08T23:02:35.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:35.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691858
2026-03-08T23:02:35.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691858'
2026-03-08T23:02:35.038 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.7 seq 171798691858
2026-03-08T23:02:35.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7
2026-03-08T23:02:35.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691858 -lt 171798691858
2026-03-08T23:02:35.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:35.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528336
2026-03-08T23:02:35.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:35.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8
2026-03-08T23:02:35.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:35.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528336
2026-03-08T23:02:35.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528336
2026-03-08T23:02:35.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528336'
2026-03-08T23:02:35.216 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.8 seq 193273528336
2026-03-08T23:02:35.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8
2026-03-08T23:02:35.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528337 -lt 193273528336
2026-03-08T23:02:35.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:35.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-322122547205
2026-03-08T23:02:35.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:35.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9
2026-03-08T23:02:35.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-322122547205
2026-03-08T23:02:35.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:35.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=322122547205
2026-03-08T23:02:35.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 322122547205'
2026-03-08T23:02:35.398 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.9 seq 322122547205
2026-03-08T23:02:35.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9
2026-03-08T23:02:35.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 322122547205 -lt 322122547205
2026-03-08T23:02:35.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:02:35.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:02:35.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:02:35.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:02:35.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:02:35.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:02:35.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:02:35.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:02:35.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:02:35.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:02:35.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:02:35.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T23:02:35.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:02:35.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:02:35.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:02:36.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T23:02:36.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:02:36.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:02:36.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg ecpool SOMETHING
2026-03-08T23:02:36.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool
2026-03-08T23:02:36.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING
2026-03-08T23:02:36.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING
2026-03-08T23:02:36.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T23:02:36.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=2.0
2026-03-08T23:02:36.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 2.0
2026-03-08T23:02:36.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0
2026-03-08T23:02:36.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0
2026-03-08T23:02:36.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:02:36.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:02:36.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:02:36.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:02:36.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:02:12.987356+0000
2026-03-08T23:02:36.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0
2026-03-08T23:02:36.744 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:02:12.987356+0000
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:02:12.987356+0000
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:02:36.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:02:36.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:12.987356+0000 '>' 2026-03-08T23:02:12.987356+0000
2026-03-08T23:02:36.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:02:37.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:02:37.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:02:37.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:02:37.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:02:37.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:02:37.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:02:37.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:02:38.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:12.987356+0000 '>' 2026-03-08T23:02:12.987356+0000
2026-03-08T23:02:38.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:02:39.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:02:39.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:02:39.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:02:39.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:02:39.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:02:39.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:02:39.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:02:39.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:12.987356+0000 '>' 2026-03-08T23:02:12.987356+0000
2026-03-08T23:02:39.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:02:40.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:02:40.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:02:40.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:02:40.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:02:40.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:02:40.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:02:40.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:02:40.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:12.987356+0000 '>' 2026-03-08T23:02:12.987356+0000
2026-03-08T23:02:40.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:02:41.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:02:41.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:02:41.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:02:41.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:02:41.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:02:41.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:02:41.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:02:41.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:12.987356+0000 '>' 2026-03-08T23:02:12.987356+0000
2026-03-08T23:02:41.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:02:42.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:02:42.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:02:42.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:02:42.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:02:42.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:02:42.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:02:42.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:02:42.923
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:37.140827+0000 '>' 2026-03-08T23:02:12.987356+0000 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: 
_objectstore_tool_nowait: local id=9 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.9 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:02:42.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:02:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:02:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:02:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:02:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 
2026-03-08T23:02:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9 2026-03-08T23:02:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:02:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:02:43.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/9 SOMETHING list-attrs 2026-03-08T23:02:43.369 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T23:02:43.369 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key 2026-03-08T23:02:43.369 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 9 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: 
activate_osd: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9' 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal' 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: 
activate_osd: get_asok_path 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:02:43.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:02:43.654 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:02:43.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/9 2026-03-08T23:02:43.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9 2026-03-08T23:02:43.655 INFO:tasks.workunit.client.0.vm03.stderr:start osd.9 2026-03-08T23:02:43.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 
--osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:02:43.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/9/whoami 2026-03-08T23:02:43.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']' 2026-03-08T23:02:43.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:02:43.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:02:43.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:02:43.673 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:43.675+0000 7f2c2684d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:43.673 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:43.679+0000 7f2c2684d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:43.674 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:43.679+0000 7f2c2684d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:43.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:44.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:44.381 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:44.387+0000 7f2c2684d8c0 -1 Falling back to public interface 2026-03-08T23:02:45.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:02:45.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:45.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:02:45.053 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:02:45.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:45.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:45.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:45.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:45.743+0000 7f2c2684d8c0 -1 osd.9 81 log_to_monitors true 2026-03-08T23:02:46.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:46.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:46.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:02:46.235 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:02:46.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:46.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:46.433 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:47.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:47.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:47.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:02:47.434 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:02:47.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:47.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:47.436 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:02:47.439+0000 7f2c1d7fd640 -1 osd.9 81 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:02:47.643 INFO:tasks.workunit.client.0.vm03.stderr:osd.9 up in weight 1 up_from 85 up_thru 0 down_at 82 last_clean_interval [75,81) [v2:127.0.0.1:6874/3709332694,v1:127.0.0.1:6875/3709332694] [v2:127.0.0.1:6876/3709332694,v1:127.0.0.1:6877/3709332694] exists,up 481e3c8d-7f04-4b16-a1ed-4401e8c9a642 2026-03-08T23:02:47.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:02:47.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:02:47.644 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:02:47.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:02:47.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:02:47.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:02:47.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:02:47.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:02:47.645 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:02:47.645 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:02:47.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:02:47.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:02:47.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:02:47.721 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:02:47.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:02:47.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:02:47.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:02:47.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:02:47.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:02:47.909 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:02:47.910 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:02:47.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 
2026-03-08T23:02:47.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:47.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:02:47.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836513 2026-03-08T23:02:47.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836513 2026-03-08T23:02:47.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513' 2026-03-08T23:02:47.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:47.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:02:48.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672991 2026-03-08T23:02:48.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672991 2026-03-08T23:02:48.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991' 2026-03-08T23:02:48.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:48.084 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:02:48.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509470
2026-03-08T23:02:48.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509470
2026-03-08T23:02:48.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991 2-64424509470'
2026-03-08T23:02:48.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:48.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:02:48.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710733
2026-03-08T23:02:48.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710733
2026-03-08T23:02:48.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991 2-64424509470 3-300647710733'
2026-03-08T23:02:48.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:48.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T23:02:48.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182427
2026-03-08T23:02:48.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182427
2026-03-08T23:02:48.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991 2-64424509470 3-300647710733 4-107374182427'
2026-03-08T23:02:48.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:48.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T23:02:48.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=343597383686
2026-03-08T23:02:48.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 343597383686
2026-03-08T23:02:48.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991 2-64424509470 3-300647710733 4-107374182427 5-343597383686'
2026-03-08T23:02:48.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:48.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T23:02:48.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855383
2026-03-08T23:02:48.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855383
2026-03-08T23:02:48.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991 2-64424509470 3-300647710733 4-107374182427 5-343597383686 6-150323855383'
2026-03-08T23:02:48.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:48.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T23:02:48.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691862
2026-03-08T23:02:48.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691862
2026-03-08T23:02:48.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991 2-64424509470 3-300647710733 4-107374182427 5-343597383686 6-150323855383 7-171798691862'
2026-03-08T23:02:48.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:48.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T23:02:48.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528341
2026-03-08T23:02:48.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528341
2026-03-08T23:02:48.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991 2-64424509470 3-300647710733 4-107374182427 5-343597383686 6-150323855383 7-171798691862 8-193273528341'
2026-03-08T23:02:48.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:02:48.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T23:02:48.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=365072220162
2026-03-08T23:02:48.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 365072220162
2026-03-08T23:02:48.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836513 1-42949672991 2-64424509470 3-300647710733 4-107374182427 5-343597383686 6-150323855383 7-171798691862 8-193273528341 9-365072220162'
2026-03-08T23:02:48.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:48.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836513
2026-03-08T23:02:48.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:48.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:02:48.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836513
2026-03-08T23:02:48.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836513
2026-03-08T23:02:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836513'
2026-03-08T23:02:48.803 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836513
2026-03-08T23:02:48.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:02:49.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836513
2026-03-08T23:02:49.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:02:50.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:02:50.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:02:50.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836513 -lt 21474836513
2026-03-08T23:02:50.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:50.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672991
2026-03-08T23:02:50.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:50.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:02:50.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672991
2026-03-08T23:02:50.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:50.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672991
2026-03-08T23:02:50.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672991'
2026-03-08T23:02:50.255 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672991
2026-03-08T23:02:50.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:02:50.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672992 -lt 42949672991
2026-03-08T23:02:50.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:50.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509470
2026-03-08T23:02:50.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:50.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:02:50.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509470
2026-03-08T23:02:50.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:50.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509470
2026-03-08T23:02:50.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509470'
2026-03-08T23:02:50.428 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509470
2026-03-08T23:02:50.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:02:50.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509470 -lt 64424509470
2026-03-08T23:02:50.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:50.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-300647710733
2026-03-08T23:02:50.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:50.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:02:50.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-300647710733
2026-03-08T23:02:50.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:50.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710733
2026-03-08T23:02:50.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 300647710733'
2026-03-08T23:02:50.638 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 300647710733
2026-03-08T23:02:50.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:02:50.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710733 -lt 300647710733
2026-03-08T23:02:50.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:50.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182427
2026-03-08T23:02:50.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:50.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T23:02:50.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182427
2026-03-08T23:02:50.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:50.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182427
2026-03-08T23:02:50.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182427'
2026-03-08T23:02:50.820 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.4 seq 107374182427
2026-03-08T23:02:50.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T23:02:50.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182427 -lt 107374182427
2026-03-08T23:02:50.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:50.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-343597383686
2026-03-08T23:02:50.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:50.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T23:02:50.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-343597383686
2026-03-08T23:02:50.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:50.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=343597383686
2026-03-08T23:02:50.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 343597383686'
2026-03-08T23:02:50.993 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.5 seq 343597383686
2026-03-08T23:02:50.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T23:02:51.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 343597383687 -lt 343597383686
2026-03-08T23:02:51.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:51.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855383
2026-03-08T23:02:51.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:51.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T23:02:51.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855383
2026-03-08T23:02:51.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:51.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855383
2026-03-08T23:02:51.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855383'
2026-03-08T23:02:51.173 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.6 seq 150323855383
2026-03-08T23:02:51.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T23:02:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855384 -lt 150323855383
2026-03-08T23:02:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:51.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691862
2026-03-08T23:02:51.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:51.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7
2026-03-08T23:02:51.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691862
2026-03-08T23:02:51.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:51.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691862
2026-03-08T23:02:51.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691862'
2026-03-08T23:02:51.351 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.7 seq 171798691862
2026-03-08T23:02:51.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7
2026-03-08T23:02:51.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691863 -lt 171798691862
2026-03-08T23:02:51.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:51.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528341
2026-03-08T23:02:51.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:51.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8
2026-03-08T23:02:51.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528341
2026-03-08T23:02:51.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:51.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528341
2026-03-08T23:02:51.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528341'
2026-03-08T23:02:51.531 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.8 seq 193273528341
2026-03-08T23:02:51.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8
2026-03-08T23:02:51.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528341 -lt 193273528341
2026-03-08T23:02:51.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:02:51.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-365072220162
2026-03-08T23:02:51.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:02:51.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9
2026-03-08T23:02:51.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-365072220162
2026-03-08T23:02:51.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:02:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=365072220162
2026-03-08T23:02:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 365072220162'
2026-03-08T23:02:51.708 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.9 seq 365072220162
2026-03-08T23:02:51.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9
2026-03-08T23:02:51.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 365072220163 -lt 365072220162
2026-03-08T23:02:51.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:02:51.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:02:51.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:02:52.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:02:52.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:02:52.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:02:52.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:02:52.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:02:52.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:02:52.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:02:52.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:02:52.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T23:02:52.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:02:52.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:02:52.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:02:52.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T23:02:52.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:02:52.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:02:52.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY
2026-03-08T23:02:52.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:262: corrupt_and_repair_erasure_coded: corrupt_and_repair_two td/osd-scrub-repair ecpool 5 9
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:185: corrupt_and_repair_two: local dir=td/osd-scrub-repair
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:186: corrupt_and_repair_two: local poolname=ecpool
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:187: corrupt_and_repair_two: local first=5
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:188: corrupt_and_repair_two: local second=9
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:193: corrupt_and_repair_two: pids=
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:194: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 184331"'
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 184331'
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:195: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 9 SOMETHING remove
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 184332"'
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 184332'
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:196: corrupt_and_repair_two: wait_background pids
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 184331 184332'
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 184331
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/184334: /'
2026-03-08T23:02:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/184336: /'
2026-03-08T23:02:52.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 9 SOMETHING remove
2026-03-08T23:02:52.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING remove
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: remove 1#2:eb822e21:::SOMETHING:head#
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:02:53.788 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: 
get_asok_dir 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:02:53.789 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: 
ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: start osd.5 2026-03-08T23:02:53.790 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 9 SOMETHING remove 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=9 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.9 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 
2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 9 SOMETHING remove 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:02:54.001 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: 
_objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/9 SOMETHING remove 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: remove 2#2:eb822e21:::SOMETHING:head# 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 9 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:02:54.002 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' 
--osd-failsafe-full-ratio=.99' 2026-03-08T23:02:54.004 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:02:54.005 
INFO:tasks.workunit.client.0.vm03.stderr:184336: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 
2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/9 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: start osd.9 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/9/whoami 2026-03-08T23:02:54.005 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']' 2026-03-08T23:02:58.192 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:02:58.192 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:02:58.192 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:02:58.192 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:02:58.192 INFO:tasks.workunit.client.0.vm03.stderr:184336: 2026-03-08T23:02:54.051+0000 7f355ff388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:58.192 INFO:tasks.workunit.client.0.vm03.stderr:184336: 2026-03-08T23:02:54.075+0000 7f355ff388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:58.192 INFO:tasks.workunit.client.0.vm03.stderr:184336: 2026-03-08T23:02:54.083+0000 7f355ff388c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:02:58.192 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: 0 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: 2026-03-08T23:02:55.047+0000 7f355ff388c0 -1 Falling back to public interface 2026-03-08T23:02:58.193 
INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: 1 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: 2026-03-08T23:02:56.067+0000 7f355ff388c0 -1 osd.9 86 log_to_monitors true 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: 2 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: 3 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:58.193 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helper184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']' 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 2026-03-08T23:02:53.811+0000 7f6f26d388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 2026-03-08T23:02:53.811+0000 7f6f26d388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 2026-03-08T23:02:53.815+0000 7f6f26d388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 0 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 2026-03-08T23:02:54.767+0000 7f6f26d388c0 -1 Falling back to public interface 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 1 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:58.220 
INFO:tasks.workunit.client.0.vm03.stderr:184334: 2026-03-08T23:02:55.971+0000 7f6f26d388c0 -1 osd.5 86 log_to_monitors true 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: 2 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:58.220 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:02:58.221 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:58.221 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:58.221 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:58.221 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:02:58.221 INFO:tasks.workunit.client.0.vm03.stderr:184334: 3 2026-03-08T23:02:58.221 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 
2026-03-08T23:02:58.221 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: 4 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: osd.9 up in weight 1 up_from 93 up_thru 0 down_at 87 last_clean_interval [85,86) [v2:127.0.0.1:6874/2550306189,v1:127.0.0.1:6875/2550306189] [v2:127.0.0.1:6876/2550306189,v1:127.0.0.1:6877/2550306189] exists,up 481e3c8d-7f04-4b16-a1ed-4401e8c9a642 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:02:59.749 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 1 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 2 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 3 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 4 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 5 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 6 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 7 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 8 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 9' 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836516 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836516 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516' 2026-03-08T23:02:59.750 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184336: /s.sh:985: wait_for_osd: sleep 1 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 2026-03-08T23:02:58.535+0000 7f6f1dce8640 -1 osd.5 86 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 4 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: osd.5 up in weight 1 up_from 92 up_thru 92 down_at 87 last_clean_interval [80,86) [v2:127.0.0.1:6842/1433999526,v1:127.0.0.1:6843/1433999526] [v2:127.0.0.1:6844/1433999526,v1:127.0.0.1:6845/1433999526] exists,up 0c5fe5be-089f-445b-a798-38a5fe8df624 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 
2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 1 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 2 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 3 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 4 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 5 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 6 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 7 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 8 2026-03-08T23:02:59.771 INFO:tasks.workunit.client.0.vm03.stderr:184334: 9' 2026-03-08T23:02:59.772 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:02:59.772 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:02:59.772 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:02:59.772 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836517 2026-03-08T23:02:59.772 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836517 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517' 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672996 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672996 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996' 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509473 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509473 2026-03-08T23:03:00.363 
INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996 2-64424509473' 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710736 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710736 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996 2-64424509473 3-300647710736' 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.363 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182430 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182430 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996 2-64424509473 3-300647710736 4-107374182430' 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=395136991235 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 395136991235 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996 2-64424509473 3-300647710736 4-107374182430 5-395136991235' 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855388 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855388 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996 2-64424509473 3-300647710736 4-107374182430 5-395136991235 6-150323855388' 2026-03-08T23:03:00.364 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standal/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672995 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672995 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995' 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509474 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509474 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509474' 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710737 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710737 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509474 3-300647710737' 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182431 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182431 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: 
seqs=' 0-21474836516 1-42949672995 2-64424509474 3-300647710737 4-107374182431' 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=395136991234 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 395136991234 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509474 3-300647710737 4-107374182431 5-395136991234' 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855387 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855387 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 
1-42949672995 2-64424509474 3-300647710737 4-107374182431 5-395136991234 6-150323855387' 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:00.453 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691866 2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691866 2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509474 3-300647710737 4-107374182431 5-395136991234 6-150323855387 7-171798691866' 2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528344 2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528344 2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509474 3-300647710737 4-107374182431 5-395136991234 6-150323855387 7-171798691866 8-193273528344'
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=399431958530
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 399431958530
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509474 3-300647710737 4-107374182431 5-395136991234 6-150323855387 7-171798691866 8-193273528344 9-399431958530'
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836516
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836516
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836516
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836516'
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.0 seq 21474836516
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836515 -lt 21474836516
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:03:02.050 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:03:02.051 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836517 -lt 21474836516
2026-03-08T23:03:02.051 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.051 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.051 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672995
2026-03-08T23:03:02.051 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:03:02.051 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691867
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691867
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996 2-64424509473 3-300647710736 4-107374182430 5-395136991235 6-150323855388 7-171798691867'
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528345
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528345
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996 2-64424509473 3-300647710736 4-107374182430 5-395136991235 6-150323855388 7-171798691867 8-193273528345'
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=399431958531
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 399431958531
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836517 1-42949672996 2-64424509473 3-300647710736 4-107374182430 5-395136991235 6-150323855388 7-171798691867 8-193273528345 9-399431958531'
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836517
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836517
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836517
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836517'
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.0 seq 21474836517
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836515 -lt 21474836517
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836517 -lt 21474836517
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.057 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672996
2026-03-08T23:03:02.675 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2u/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672995
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672995
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672995'
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.1 seq 42949672995
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672996 -lt 42949672995
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509474
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509474
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509474
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509474'
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.2 seq 64424509474
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509474 -lt 64424509474
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-300647710737
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-300647710737
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710737
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 300647710737'
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.3 seq 300647710737
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710738 -lt 300647710737
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182431
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182431
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182431
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182431'
2026-03-08T23:03:02.676 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.4 seq 107374182431
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone274: flush_pg_stats: osd=1
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672996
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672996
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672996'
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.1 seq 42949672996
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672996 -lt 42949672996
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509473
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509473
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509473
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509473'
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.2 seq 64424509473
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509474 -lt 64424509473
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-300647710736
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-300647710736
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710736
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 300647710736'
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.3 seq 300647710736
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710738 -lt 300647710736
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182430
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182430
2026-03-08T23:03:02.678 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182430
2026-03-08T23:03:03.441 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/st.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T23:03:03.441 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182431 -lt 107374182431
2026-03-08T23:03:03.441 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:03.441 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:03.441 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-395136991234
2026-03-08T23:03:03.441 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-395136991234
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=395136991234
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 395136991234'
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.5 seq 395136991234
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991236 -lt 395136991234
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855387
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855387
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855387
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855387'
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.6 seq 150323855387
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855388 -lt 150323855387
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691866
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691866
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691866
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691866'
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.7 seq 171798691866
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691867 -lt 171798691866
2026-03-08T23:03:03.442 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:03.452 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:22andalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182430'
2026-03-08T23:03:03.452 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.4 seq 107374182430
2026-03-08T23:03:03.452 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T23:03:03.452 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182431 -lt 107374182430
2026-03-08T23:03:03.452 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:03.452 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:03.452 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-395136991235
2026-03-08T23:03:03.452 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-395136991235
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=395136991235
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 395136991235'
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.5 seq 395136991235
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 395136991236 -lt 395136991235
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855388
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855388
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855388
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855388'
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.6 seq 150323855388
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855388 -lt 150323855388
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691867
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691867
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691867
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691867'
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.7 seq 171798691867
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7
2026-03-08T23:03:03.453 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691867 -lt 171798691867
2026-03-08T23:03:04.267 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528345
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528345
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528345
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528345'
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.8 seq 193273528345
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528345 -lt 193273528345
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-399431958531
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-399431958531
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=399431958531
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 399431958531'
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: waiting osd.9 seq 399431958531
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 399431958532 -lt 399431958531
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:03:04.268 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ce74: flush_pg_stats: echo 8-193273528344 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:04.273 
INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528344 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528344 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528344' 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.8 seq 193273528344 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528345 -lt 193273528344 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-399431958530 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-399431958530 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=399431958530 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 399431958530' 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: waiting osd.9 seq 399431958530 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 399431958532 -lt 399431958530 2026-03-08T23:03:04.273 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:03:04.274 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:03:04.489 
INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:03:04.489 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:04.489 INFO:tasks.workunit.client.0.vm03.stderr:184334: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:04.489 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:03:04.489 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:03:04.489 INFO:tasks.workunit.client.0.vm03.stderr:184334: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:03:04.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:03:04.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:03:04.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 184332 2026-03-08T23:03:04.509 INFO:tasks.workunit.client.0.vm03.stderr:s 2026-03-08T23:03:04.509 INFO:tasks.workunit.client.0.vm03.stderr:184336: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:04.509 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: 
test 5 = 5 2026-03-08T23:03:04.509 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr:184336: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:197: corrupt_and_repair_two: return_code=0 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:198: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: get_pg ecpool SOMETHING 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:03:04.510 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:03:04.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:03:04.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: local pg=2.0 2026-03-08T23:03:04.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:204: corrupt_and_repair_two: repair 2.0 2026-03-08T23:03:04.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:03:04.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:03:04.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:04.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:04.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:04.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:04.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local 
last_scrub=2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:04.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:03:05.029 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:05.083 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:05.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:05.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:37.140827+0000 '>' 2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:05.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:06.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:03:06.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:06.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:06.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:06.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:06.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:06.205 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:06.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:37.140827+0000 '>' 2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:06.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:03:07.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:07.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:07.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:07.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:07.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:07.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:07.551 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:37.140827+0000 '>' 2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:07.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:08.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:03:08.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:08.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:08.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:08.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:08.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:08.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:08.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:37.140827+0000 '>' 2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:08.728 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:09.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:03:09.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:09.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:09.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:09.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:09.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:09.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:09.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:37.140827+0000 '>' 2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:09.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:10.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:03:10.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:10.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:10.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:10.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:10.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:10.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:11.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:02:37.140827+0000 '>' 2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:11.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:12.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:03:12.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:12.105 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:12.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:12.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:12.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:12.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:03:05.265770+0000 '>' 2026-03-08T23:02:37.140827+0000 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:208: corrupt_and_repair_two: pids= 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:209: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:03:12.282 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 187884"' 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 187884' 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:210: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:03:12.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 187885"' 2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 187885' 2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:211: corrupt_and_repair_two: wait_background pids 2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 187884 187885' 2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 
2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 187884 2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/187887: /' 2026-03-08T23:03:12.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/187889: /' 2026-03-08T23:03:12.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:03:12.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:03:13.241 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: 
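The `run_in_background pids ...` / `wait_background pids` calls traced above implement a small job-group pattern: the first helper launches a command in the background and appends its PID to a variable named by the caller (via `eval`, as visible at line 2195 of the trace), and the second waits on every recorded PID and reports failure if any child failed. A simplified sketch of the pattern, omitting the PID-prefixed output redirection (`sed 's/^/PID: /'`) that the real helpers also do:

```shell
#!/usr/bin/env bash
# run_in_background VAR CMD...  -> run CMD in the background and
#                                  append its PID to the variable VAR
# wait_background VAR           -> wait for every PID in VAR; return
#                                  non-zero if any child failed
run_in_background() {
    local pid_variable=$1
    shift
    "$@" &
    eval "$pid_variable+=\" $!\""
}

wait_background() {
    local pids="${!1}"      # indirect expansion: value of the named var
    local return_code=0 pid
    for pid in $pids; do
        wait "$pid" || return_code=1
    done
    eval "$1="              # reset the caller's PID list
    return $return_code
}

pids=
run_in_background pids sleep 0.1
run_in_background pids sleep 0.1
wait_background pids && echo "all jobs succeeded"
```

Passing the variable *name* (`pids`) rather than its value lets the helpers mutate the caller's list, which is why the trace shows `eval 'pids+=" 187884"'` rather than a direct assignment.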
_objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING list-attrs 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: _ 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: hinfo_key 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: snapset 2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5 
2026-03-08T23:03:13.248 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 
2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo 
'/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: start osd.5 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_rec187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: 
shift 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=9 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.9 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:03:13.249 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/9 SOMETHING list-attrs 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: _ 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: hinfo_key 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: snapset 2026-03-08T23:03:13.250 
INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 9 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:03:13.250 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:03:13.266 
INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo 
/tmp/ceph-asok.43024 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:03:13.266 
INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/9 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: start osd.9 2026-03-08T23:03:13.266 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/9/whoami 2026-03-08T23:03:17.444 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460overy_ops 2026-03-08T23:03:17.444 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami 2026-03-08T23:03:17.444 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: 
activate_osd: '[' 5 = 5 ']' 2026-03-08T23:03:17.444 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:03:17.444 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:03:17.444 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 2026-03-08T23:03:13.271+0000 7fe5ba1228c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 2026-03-08T23:03:13.295+0000 7fe5ba1228c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 2026-03-08T23:03:13.307+0000 7fe5ba1228c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 0 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 2026-03-08T23:03:14.275+0000 7fe5ba1228c0 -1 Falling back to public interface 2026-03-08T23:03:17.445 
INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 1 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 2026-03-08T23:03:15.335+0000 7fe5ba1228c0 -1 osd.5 94 log_to_monitors true 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 2 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: 3 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:17.445 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/c --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']' 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 2026-03-08T23:03:13.287+0000 7f4d7be368c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 2026-03-08T23:03:13.295+0000 7f4d7be368c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 2026-03-08T23:03:13.303+0000 7f4d7be368c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 0 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 0 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 2026-03-08T23:03:14.259+0000 7f4d7be368c0 -1 Falling back to public interface 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 1 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 2026-03-08T23:03:15.331+0000 7f4d7be368c0 -1 osd.9 94 
log_to_monitors true 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:17.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: 2 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: 3 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:03:17.488 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: 4 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: osd.9 up in weight 1 up_from 98 up_thru 0 down_at 95 last_clean_interval [93,94) [v2:127.0.0.1:6842/2307058483,v1:127.0.0.1:6843/2307058483] [v2:127.0.0.1:6844/2307058483,v1:127.0.0.1:6845/2307058483] exists,up 481e3c8d-7f04-4b16-a1ed-4401e8c9a642 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: 
wait_for_osd: return 0 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:03:19.161 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: 
wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 1 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 2 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 3 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 4 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 5 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 6 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 7 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 8 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: 9' 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: 
seqs= 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836522 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836522 2026-03-08T23:03:19.162 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522' 2026-03-08T23:03:19.185 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $idseph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 2026-03-08T23:03:18.443+0000 7fe5b10d2640 -1 osd.5 94 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 4 2026-03-08T23:03:19.186 
INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: osd.5 up in weight 1 up_from 98 up_thru 98 down_at 95 last_clean_interval [92,94) [v2:127.0.0.1:6874/4115425458,v1:127.0.0.1:6875/4115425458] [v2:127.0.0.1:6876/4115425458,v1:127.0.0.1:6877/4115425458] exists,up 0c5fe5be-089f-445b-a798-38a5fe8df624 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 
2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 1 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 2 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 3 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 4 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 5 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 6 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 7 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 8 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: 9' 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836523 2026-03-08T23:03:19.186 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836523 2026-03-08T23:03:19.941 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523' 2026-03-08T23:03:19.941 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:19.941 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:03:19.941 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673002 2026-03-08T23:03:19.941 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673002 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002' 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509480 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509480 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002 2-64424509480' 
2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710743 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710743 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002 2-64424509480 3-300647710743' 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182437 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182437 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002 2-64424509480 3-300647710743 4-107374182437' 2026-03-08T23:03:19.942 
INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795011 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795011 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002 2-64424509480 3-300647710743 4-107374182437 5-420906795011' 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855394 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855394 2026-03-08T23:03:19.942 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002 2-64424509480 3-300647710743 4-107374182437 5-420906795011 6-150323855394' 2026-03-08T23:03:19.942 
INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/ 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673001 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673001 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001' 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509479 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509479 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001 2-64424509479' 2026-03-08T23:03:20.063 
INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=300647710742 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 300647710742 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001 2-64424509479 3-300647710742' 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182436 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182436 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001 2-64424509479 3-300647710742 4-107374182436' 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795010 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795010 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001 2-64424509479 3-300647710742 4-107374182436 5-420906795010' 2026-03-08T23:03:20.063 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.064 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:03:20.064 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855393 2026-03-08T23:03:20.064 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855393 2026-03-08T23:03:20.064 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001 2-64424509479 3-300647710742 4-107374182436 5-420906795010 6-150323855393' 2026-03-08T23:03:20.064 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.064 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:22qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691872 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691872 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002 2-64424509480 3-300647710743 4-107374182437 5-420906795011 6-150323855394 7-171798691872' 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528350 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528350 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002 2-64424509480 3-300647710743 4-107374182437 5-420906795011 6-150323855394 7-171798691872 8-193273528350' 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795010 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795010 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949673002 2-64424509480 3-300647710743 4-107374182437 5-420906795011 6-150323855394 7-171798691872 8-193273528350 9-420906795010' 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836523 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 
2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836523 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836523 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836523' 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.0 seq 21474836523 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836523 -lt 21474836523 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673002 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673002 2026-03-08T23:03:20.472 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673002 2026-03-08T23:03:20.486 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stat65: flush_pg_stats: seq=171798691873 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691873 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001 2-64424509479 3-300647710742 4-107374182436 5-420906795010 6-150323855393 7-171798691873' 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528351 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528351 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001 2-64424509479 3-300647710742 4-107374182436 5-420906795010 6-150323855393 7-171798691873 8-193273528351' 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795011 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795011 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949673001 2-64424509479 3-300647710742 4-107374182436 5-420906795010 6-150323855393 7-171798691873 8-193273528351 9-420906795011' 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836522 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 
2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836522 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836522 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836522' 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.0 seq 21474836522 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836523 -lt 21474836522 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673001 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673001 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673001 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673001' 2026-03-08T23:03:20.487 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.1 seq 42949673001 2026-03-08T23:03:21.305 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:03:21.305 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673002 -lt 42949673001 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509479 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509479 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509479 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509479' 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.2 seq 64424509479 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509480 -lt 64424509479 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-300647710742 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:21.306 
INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-300647710742 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710742 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 300647710742' 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.3 seq 300647710742 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710743 -lt 300647710742 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182436 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182436 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182436 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182436' 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.4 seq 107374182436 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182437 -lt 107374182436 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:21.306 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubunts: echo 'waiting osd.1 seq 42949673002' 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.1 seq 42949673002 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673002 -lt 42949673002 2026-03-08T23:03:21.317 
INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509480 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509480 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509480 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509480' 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.2 seq 64424509480 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509480 -lt 64424509480 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-300647710743 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-300647710743 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=300647710743 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 300647710743' 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.3 seq 300647710743 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 300647710743 -lt 300647710743 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182437 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182437 2026-03-08T23:03:21.317 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182437 2026-03-08T23:03:21.318 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182437' 2026-03-08T23:03:21.318 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.4 seq 107374182437 2026-03-08T23:03:21.318 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:03:21.318 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182437 -lt 107374182437 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flusu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-420906795010 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-420906795010 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795010 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 420906795010' 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.5 seq 420906795010 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795010 -lt 420906795010 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855393 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855393 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855393 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855393' 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.6 seq 150323855393 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:03:23.171 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855392 -lt 150323855393 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855394 -lt 150323855393 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691873 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691873 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691873 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691873' 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.7 seq 171798691873 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691873 -lt 171798691873 2026-03-08T23:03:23.172 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-hh_pg_stats: for s in $seqs 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-420906795011 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-420906795011 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795011 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 420906795011' 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting 
osd.5 seq 420906795011 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795010 -lt 420906795011 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795012 -lt 420906795011 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855394 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855394 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855394 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855394' 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.6 seq 150323855394 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855394 -lt 150323855394 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691872 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:23.178 
INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691872 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691872 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691872' 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.7 seq 171798691872 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:03:23.178 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691873 -lt 171798691872 2026-03-08T23:03:23.992 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528350 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528350 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528350 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528350' 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.8 seq 193273528350 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528351 -lt 193273528350 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-420906795010 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-420906795010 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795010 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 420906795010' 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: waiting osd.9 seq 420906795010 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795012 -lt 420906795010 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:03:23.993 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/selpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528351 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:03:24.015 
INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528351 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528351 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528351' 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.8 seq 193273528351 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528351 -lt 193273528351 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-420906795011 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:24.015 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-420906795011 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795011 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 420906795011' 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: waiting osd.9 seq 420906795011 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795012 -lt 420906795011 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:03:24.016 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .tandalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 
2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:187887: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:187887: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:03:24.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 187885 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:pgmap.num_pgs 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:187889: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 
2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:187889: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:212: corrupt_and_repair_two: return_code=0 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:213: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:03:24.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:215: corrupt_and_repair_two: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:03:24.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:216: corrupt_and_repair_two: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:263: corrupt_and_repair_erasure_coded: corrupt_and_repair_two td/osd-scrub-repair ecpool 3 5 2026-03-08T23:03:24.266 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:185: corrupt_and_repair_two: local dir=td/osd-scrub-repair 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:186: corrupt_and_repair_two: local poolname=ecpool 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:187: corrupt_and_repair_two: local first=3 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:188: corrupt_and_repair_two: local second=5 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:193: corrupt_and_repair_two: pids= 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:194: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 190851"' 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 190851' 2026-03-08T23:03:24.266 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:195: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 190853"' 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 190853' 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:196: corrupt_and_repair_two: wait_background pids 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 190851 190853' 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:03:24.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:03:24.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 190851 2026-03-08T23:03:24.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/190854: /' 
2026-03-08T23:03:24.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/190856: /' 2026-03-08T23:03:24.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:03:24.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 
2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: remove 0#2:eb822e21:::SOMETHING:head# 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 
2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:03:25.773 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:25.775 INFO:tasks.workunit.client.0.vm03.stderr:190854: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' 
--log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: start osd.3 2026-03-08T23:03:25.776 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:03:25.801 
INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING remove 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: 
activate_osd: local id=5 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:03:25.801 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:03:25.807 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:25.807 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:03:25.807 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 2026-03-08T23:03:25.807 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:03:25.807 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:03:25.807 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:03:25.807 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 
2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 
2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: start osd.5 2026-03-08T23:03:25.808 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: 2026-03-08T23:03:25.794+0000 7f5fb5b178c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: 2026-03-08T23:03:25.798+0000 7f5fb5b178c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: 2026-03-08T23:03:25.798+0000 7f5fb5b178c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: 0 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: 2026-03-08T23:03:27.002+0000 7f5fb5b178c0 -1 Falling back to public interface 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: 1 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: 2026-03-08T23:03:28.310+0000 7f5fb5b178c0 -1 osd.3 100 log_to_monitors true 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:29.806 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: 2 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: 3 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:29.807 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:03:29.825 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']' 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 2026-03-08T23:03:25.838+0000 7f3cf29798c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 2026-03-08T23:03:25.862+0000 7f3cf29798c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 2026-03-08T23:03:25.870+0000 7f3cf29798c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 0 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 2026-03-08T23:03:27.090+0000 7f3cf29798c0 -1 Falling back to public interface 2026-03-08T23:03:29.827 
INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 1 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 2026-03-08T23:03:28.302+0000 7f3cf29798c0 -1 osd.5 100 log_to_monitors true 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: 2 2026-03-08T23:03:29.827 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:03:29.828 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:29.828 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:29.828 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:29.828 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:29.828 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:03:29.828 INFO:tasks.workunit.client.0.vm03.stderr:190856: 3 2026-03-08T23:03:29.828 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:29.828 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:31.337 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:31.337 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:31.337 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:03:31.338 
INFO:tasks.workunit.client.0.vm03.stderr:190854: 4 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: osd.3 up in weight 1 up_from 104 up_thru 104 down_at 101 last_clean_interval [70,100) [v2:127.0.0.1:6826/218443388,v1:127.0.0.1:6827/218443388] [v2:127.0.0.1:6828/218443388,v1:127.0.0.1:6829/218443388] exists,up df112e4e-0f41-42b6-ad94-9c02d4399d65 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:03:31.338 
INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: 
flush_pg_stats: local timeout=300 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 1 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 2 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 3 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 4 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 5 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 6 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 7 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 8 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: 9' 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836526 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836526 2026-03-08T23:03:31.338 
INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526' 2026-03-08T23:03:31.338 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190854:rs.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 4 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: osd.5 up in weight 1 up_from 104 up_thru 104 down_at 101 last_clean_interval [98,100) [v2:127.0.0.1:6874/73644110,v1:127.0.0.1:6875/73644110] [v2:127.0.0.1:6876/73644110,v1:127.0.0.1:6877/73644110] exists,up 0c5fe5be-089f-445b-a798-38a5fe8df624 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 1 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 2 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 3 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 4 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 5 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 6 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 7 2026-03-08T23:03:31.357 
INFO:tasks.workunit.client.0.vm03.stderr:190856: 8 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: 9' 2026-03-08T23:03:31.357 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:03:31.358 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:31.358 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:03:31.358 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836527 2026-03-08T23:03:31.358 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836527 2026-03-08T23:03:31.358 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527' 2026-03-08T23:03:31.358 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673006 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673006 2026-03-08T23:03:32.018 
INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673006' 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509484 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509484 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673006 2-64424509484' 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=446676598786 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 446676598786 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836527 1-42949673006 2-64424509484 3-446676598786' 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182440 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182440 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673006 2-64424509484 3-446676598786 4-107374182440' 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=446676598786 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 446676598786 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673006 
2-64424509484 3-446676598786 4-107374182440 5-446676598786' 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855397 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855397 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673006 2-64424509484 3-446676598786 4-107374182440 5-446676598786 6-150323855397' 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.018 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:03:32.039 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:03:32.039 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673005 2026-03-08T23:03:32.039 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673005 2026-03-08T23:03:32.039 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005' 2026-03-08T23:03:32.039 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.039 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509483 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509483 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005 2-64424509483' 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=446676598787 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 446676598787 
2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005 2-64424509483 3-446676598787' 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182441 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182441 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005 2-64424509483 3-446676598787 4-107374182441' 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=446676598787 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 446676598787 2026-03-08T23:03:32.040 
INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005 2-64424509483 3-446676598787 4-107374182441 5-446676598787' 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855398 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855398 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005 2-64424509483 3-446676598787 4-107374182441 5-446676598787 6-150323855398' 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.040 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:03:32.404 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flus_stats: seq=171798691876 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691876 2026-03-08T23:03:32.410 
INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673006 2-64424509484 3-446676598786 4-107374182440 5-446676598786 6-150323855397 7-171798691876' 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528354 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528354 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673006 2-64424509484 3-446676598786 4-107374182440 5-446676598786 6-150323855397 7-171798691876 8-193273528354' 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795015 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: 
flush_pg_stats: test -z 420906795015 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673006 2-64424509484 3-446676598786 4-107374182440 5-446676598786 6-150323855397 7-171798691876 8-193273528354 9-420906795015' 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836527 2026-03-08T23:03:32.410 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836527 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836527 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836527' 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.0 seq 21474836527 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836527 -lt 21474836527 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673006 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673006 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673006 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673006' 2026-03-08T23:03:32.411 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.1 seq 42949673006 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stah_pg_stats: seq=171798691877 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691877 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005 2-64424509483 3-446676598787 4-107374182441 5-446676598787 6-150323855398 7-171798691877' 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528355 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528355 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005 2-64424509483 3-446676598787 4-107374182441 5-446676598787 6-150323855398 7-171798691877 8-193273528355' 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795016 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795016 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836526 1-42949673005 2-64424509483 3-446676598787 4-107374182441 5-446676598787 6-150323855398 7-171798691877 8-193273528355 9-420906795016' 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836526 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836526 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836526 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.0 seq 21474836526' 2026-03-08T23:03:32.439 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.0 seq 21474836526 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836527 -lt 21474836526 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673005 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673005 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673005 2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673005' 
2026-03-08T23:03:32.440 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.1 seq 42949673005 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673006 -lt 42949673006 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509484 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509484 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509484 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509484' 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
waiting osd.2 seq 64424509484 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509484 -lt 64424509484 2026-03-08T23:03:33.174 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-446676598786 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-446676598786 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=446676598786 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 446676598786' 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.3 seq 446676598786 2026-03-08T23:03:33.175 
INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 446676598787 -lt 446676598786 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182440 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182440 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182440 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182440' 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.4 seq 107374182440 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182440 -lt 107374182440 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:33.175 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.187 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/c_stats: ceph osd last-stat-seq 1 2026-03-08T23:03:34.187 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673006 -lt 42949673005 2026-03-08T23:03:34.187 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.187 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.187 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509483 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509483 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509483 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509483' 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.2 seq 64424509483 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509484 -lt 64424509483 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-446676598787 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-446676598787 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=446676598787 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 446676598787' 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.3 seq 446676598787 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 446676598787 -lt 446676598787 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182441 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182441 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182441 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182441' 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.4 seq 107374182441 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182440 -lt 107374182441 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:03:34.188 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-446676598786 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-446676598786 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=446676598786 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 446676598786' 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.5 seq 446676598786 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 446676598786 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 446676598788 -lt 446676598786 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855397 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855397 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855397 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855397' 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.6 seq 150323855397 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855398 -lt 150323855397 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691876 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691876 2026-03-08T23:03:34.914 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691876 2026-03-08T23:03:34.915 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691876' 2026-03-08T23:03:34.915 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.7 seq 171798691876 2026-03-08T23:03:34.915 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:03:34.915 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691877 -lt 171798691876 2026-03-08T23:03:34.915 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.920 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_plone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:03:34.920 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182442 -lt 107374182441 2026-03-08T23:03:34.920 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.920 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.920 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-446676598787 2026-03-08T23:03:34.920 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-446676598787 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=446676598787 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 446676598787' 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.5 seq 446676598787 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 446676598788 -lt 446676598787 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855398 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855398 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855398 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855398' 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.6 seq 150323855398 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855398 -lt 150323855398 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691877 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691877 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691877 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691877' 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.7 seq 171798691877 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691877 -lt 171798691877 2026-03-08T23:03:34.921 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:35.741 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sg_stats: cut -d - -f 1 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528354 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528354 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528354 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528354' 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.8 seq 193273528354 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528356 -lt 193273528354 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-420906795015 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-420906795015 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795015 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 420906795015' 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: waiting osd.9 seq 420906795015 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:03:35.742 
INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795016 -lt 420906795015 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:03:35.742 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:03:35.743 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:35.743 INFO:tasks.workunit.client.0.vm03.stderr:1h:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:35.743 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528355 2026-03-08T23:03:35.743 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:03:35.743 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:35.743 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528355 2026-03-08T23:03:35.743 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528355 2026-03-08T23:03:35.744 
INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528355' 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.8 seq 193273528355 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528356 -lt 193273528355 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-420906795016 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-420906795016 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795016 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 420906795016' 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: waiting osd.9 seq 420906795016 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795016 -lt 420906795016 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:03:35.744 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:03:35.976 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:35.976 INFO:tasks.workunit.client.0.vm03.stderr:190856: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:35.977 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:03:35.977 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:03:35.977 INFO:tasks.workunit.client.0.vm03.stderr:190856: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687:
wait_for_clean: return 0 2026-03-08T23:03:35.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:190854: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:190854: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 190853 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:03:36.004
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:197: corrupt_and_repair_two: return_code=0 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:198: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:03:36.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: get_pg ecpool SOMETHING 2026-03-08T23:03:36.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:03:36.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:03:36.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:03:36.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:03:36.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: local pg=2.0 2026-03-08T23:03:36.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:204: corrupt_and_repair_two: repair 2.0 2026-03-08T23:03:36.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:03:36.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:03:36.207 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:36.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:36.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:36.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:36.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:03:05.265770+0000 2026-03-08T23:03:36.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:03:36.543 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:03:05.265770+0000 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:03:05.265770+0000 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 
2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:36.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:36.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:03:05.265770+0000 '>' 2026-03-08T23:03:05.265770+0000 2026-03-08T23:03:36.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:37.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:03:37.774 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:37.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:37.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:37.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:37.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:37.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:37.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:03:05.265770+0000 '>' 2026-03-08T23:03:05.265770+0000 2026-03-08T23:03:37.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:38.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:03:39.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:39.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:39.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:39.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:39.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:39.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:39.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:03:05.265770+0000 '>' 2026-03-08T23:03:05.265770+0000 2026-03-08T23:03:39.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:03:40.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:03:40.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:03:40.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:03:40.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:03:40.142 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:03:40.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:03:40.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:03:37.253191+0000 '>' 2026-03-08T23:03:05.265770+0000 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:208: corrupt_and_repair_two: pids= 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:209: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 194330"' 2026-03-08T23:03:40.314 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 194330' 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:210: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 194331"' 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 194331' 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:211: corrupt_and_repair_two: wait_background pids 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 194330 194331' 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 194330 
2026-03-08T23:03:40.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/194333: /' 2026-03-08T23:03:40.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/194335: /' 2026-03-08T23:03:40.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T23:03:40.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:03:41.078 
INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING 
list-attrs 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING list-attrs 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: _ 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: hinfo_key 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: snapset 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: 
activate_osd: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: 
_objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T23:03:41.078 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING list-attrs 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: Error getting attr on : 2.0s0_head,0#-4:00000000:::scrub_2.0s0:head#, (61) No data available 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: _ 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: hinfo_key 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: snapset 2026-03-08T23:03:41.079 
INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:03:41.079 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' s.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' 
--osd-data=td/osd-scrub-repair/5' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: 
ceph_args+=' ' 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:03:41.081 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194335: start osd.5 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log--osd-failsafe-full-ratio=.99' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' 
--osd-mclock-profile=high_recovery_ops' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: start osd.3 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:03:41.082 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:03:45.008 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' 
--osd-mclock-profile=high_recovery_ops 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 2026-03-08T23:03:41.130+0000 7f968b4188c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 2026-03-08T23:03:41.138+0000 7f968b4188c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 2026-03-08T23:03:41.146+0000 7f968b4188c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 0 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:45.009 
INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 1 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 2026-03-08T23:03:42.598+0000 7f968b4188c0 -1 Falling back to public interface 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 2 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 
2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 2026-03-08T23:03:44.014+0000 7f968b4188c0 -1 osd.3 105 log_to_monitors true 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: 3 2026-03-08T23:03:45.009 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpe-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 
2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 2026-03-08T23:03:41.126+0000 7f6f576a68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 2026-03-08T23:03:41.138+0000 7f6f576a68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 2026-03-08T23:03:41.146+0000 7f6f576a68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 0 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 
2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 2026-03-08T23:03:41.622+0000 7f6f576a68c0 -1 Falling back to public interface 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 1 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 2026-03-08T23:03:43.142+0000 7f6f576a68c0 -1 osd.5 105 log_to_monitors true 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: 2 2026-03-08T23:03:45.268 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:45.269 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:45.269 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:45.269 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:45.269 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:45.269 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:03:45.269 INFO:tasks.workunit.client.0.vm03.stderr:194335: 3 2026-03-08T23:03:45.269 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:03:45.269 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:45.707 INFO:tasks.workunit.client.0.vm03.stderr:194335: osd.5 up in weight 1 up_from 110 up_thru 110 dowrs.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:03:45.708 
INFO:tasks.workunit.client.0.vm03.stderr:194333: osd.3 up in weight 1 up_from 112 up_thru 112 down_at 106 last_clean_interval [104,105) [v2:127.0.0.1:6874/2761941111,v1:127.0.0.1:6875/2761941111] [v2:127.0.0.1:6876/2761941111,v1:127.0.0.1:6877/2761941111] exists,up df112e4e-0f41-42b6-ad94-9c02d4399d65 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:03:45.708 
INFO:tasks.workunit.client.0.vm03.stderr:194333: 1 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 2 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 3 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 4 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 5 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 6 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 7 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 8 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 9' 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836531 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836531 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531' 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673010 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673010 2026-03-08T23:03:45.708 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010' 2026-03-08T23:03:45.717 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0n_at 106 last_clean_interval [104,105) [v2:127.0.0.1:6826/3345009935,v1:127.0.0.1:6827/3345009935] [v2:127.0.0.1:6828/3345009935,v1:127.0.0.1:6829/3345009935] exists,up 0c5fe5be-089f-445b-a798-38a5fe8df624 2026-03-08T23:03:45.717 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:03:45.717 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:03:45.717 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:03:45.717 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:03:45.717 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:03:45.717 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: 
wait_for_clean: local num_active_clean=-1 2026-03-08T23:03:45.717 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
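The `wait_for_clean` setup traced here (delays array, `loop=0`, then `flush_pg_stats`) feeds a retry loop that re-checks the cluster and sleeps the next delay from the schedule between attempts. The polling phase itself falls outside this excerpt, so the following is only a schematic sketch of that pattern; `is_clean` is a stub standing in for the real "all PGs active+clean" query, not anything from ceph-helpers.sh:

```shell
# Schematic retry-with-delay-schedule loop in the style of wait_for_clean:
# re-check a condition, sleeping the next delay between attempts, and give
# up when the schedule is exhausted. is_clean is a stub that succeeds on
# its third call, standing in for the real PG-state check.
attempts_left=3
is_clean() {
    attempts_left=$((attempts_left - 1))
    [ "$attempts_left" -le 0 ]
}

wait_with_delays() {
    # shortened delay schedule for the demo (the traced one sums to 90s)
    for delay in 0.1 0.2 0.4 0.8; do
        if is_clean; then
            echo clean
            return 0
        fi
        sleep "$delay"
    done
    echo timeout
    return 1
}

wait_with_delays   # -> clean (on the third probe)
```

Exhausting the schedule is what turns a stuck cluster into a bounded test failure instead of an indefinite hang.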
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 1 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 2 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 3 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 4 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 5 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 6 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 7 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 8 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: 9' 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: seq=21474836532 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836532 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532' 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673011 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673011 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011' 2026-03-08T23:03:45.718 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest//qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509488 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509488 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010 2-64424509488' 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=481036337154 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 481036337154 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010 2-64424509488 3-481036337154' 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=107374182445 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182445 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010 2-64424509488 3-481036337154 4-107374182445' 2026-03-08T23:03:46.353 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=472446402562 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 472446402562 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010 2-64424509488 3-481036337154 4-107374182445 5-472446402562' 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855402 
2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855402 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010 2-64424509488 3-481036337154 4-107374182445 5-472446402562 6-150323855402' 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691881 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691881 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010 2-64424509488 3-481036337154 4-107374182445 5-472446402562 6-150323855402 7-171798691881' 2026-03-08T23:03:46.354 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qclone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509489 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509489 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011 2-64424509489' 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=481036337155 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 481036337155 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011 2-64424509489 3-481036337155' 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=107374182446 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182446 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011 2-64424509489 3-481036337155 4-107374182446' 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=472446402563 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 472446402563 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011 2-64424509489 3-481036337155 4-107374182446 5-472446402563' 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855403 
2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855403 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011 2-64424509489 3-481036337155 4-107374182446 5-472446402563 6-150323855403' 2026-03-08T23:03:46.483 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.484 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:03:46.484 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691882 2026-03-08T23:03:46.484 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691882 2026-03-08T23:03:46.484 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011 2-64424509489 3-481036337155 4-107374182446 5-472446402563 6-150323855403 7-171798691882' 2026-03-08T23:03:46.484 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.484 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuna/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 
flush_pg_stats 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528359 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528359 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010 2-64424509488 3-481036337154 4-107374182445 5-472446402562 6-150323855402 7-171798691881 8-193273528359' 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795020 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795020 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836531 1-42949673010 2-64424509488 3-481036337154 4-107374182445 5-472446402562 6-150323855402 7-171798691881 8-193273528359 9-420906795020' 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
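The `flush_pg_stats` activity traced above is a two-phase pattern: first `ceph tell osd.N flush_pg_stats` is issued to every OSD and each returned sequence number is recorded as an `osd-seq` pair; then, for each pair, the helper polls `ceph osd last-stat-seq N` until the monitor has caught up to that seq. A condensed sketch of the same control flow, with a mock `ceph` function substituted for the real CLI so it runs without a cluster:

```shell
# Sketch of the flush_pg_stats pattern traced from ceph-helpers.sh
# (lines 2260-2280): phase 1 collects "osd-seq" pairs from each OSD,
# phase 2 polls the mon until it has seen every seq. This `ceph` is a
# mock, not the real CLI.
ceph() {
    case "$1 $2" in
        "tell osd.0")         echo 100 ;;   # flush returns a seq number
        "osd last-stat-seq")  echo 100 ;;   # mon already caught up
    esac
}

# phase 1: flush every OSD, remembering the seq each one returned
seqs=""
for osd in 0; do
    seq=$(ceph tell "osd.$osd" flush_pg_stats)
    seqs="$seqs $osd-$seq"
done

# phase 2: wait until the mon's last-stat-seq reaches each recorded seq
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)
    seq=$(echo "$s" | cut -d - -f 2)
    echo "waiting osd.$osd seq $seq"
    while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
        sleep 1
    done
done
```

The `cut -d - -f 1` / `-f 2` splitting and the `test … -lt …` poll mirror the exact commands visible in the trace; waiting on the seq guarantees the mon's PG stats reflect all OSD state before `wait_for_clean` inspects them.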
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836531 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836531 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:46.972 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836531 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836531' 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.0 seq 21474836531 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836532 -lt 21474836531 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673010 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673010 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673010 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673010' 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.1 seq 42949673010 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673011 -lt 42949673010 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:46.973 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:46.973 
INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509488 2026-03-08T23:03:47.033 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/ceptu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528360 2026-03-08T23:03:47.033 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528360 2026-03-08T23:03:47.033 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011 2-64424509489 3-481036337155 4-107374182446 5-472446402563 6-150323855403 7-171798691882 8-193273528360' 2026-03-08T23:03:47.033 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:03:47.033 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:03:47.033 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795021 2026-03-08T23:03:47.033 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795021 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836532 1-42949673011 2-64424509489 3-481036337155 4-107374182446 5-472446402563 6-150323855403 7-171798691882 8-193273528360 9-420906795021' 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836532 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836532 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836532 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836532' 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.0 seq 21474836532 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836532 -lt 21474836532 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673011 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673011 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673011 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673011' 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.1 seq 42949673011 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673011 -lt 42949673011 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:47.034 
INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509489 2026-03-08T23:03:47.034 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntuhtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509488 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509488 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509488' 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.2 seq 64424509488 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509487 -lt 64424509488 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509490 -lt 64424509488 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-481036337154 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-481036337154 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=481036337154 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 481036337154' 2026-03-08T23:03:48.839 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.3 seq 481036337154 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 481036337155 -lt 481036337154 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182445 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182445 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182445 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182445' 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.4 seq 107374182445 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182447 -lt 107374182445 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:48.840 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: fl/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509489 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509489 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509489' 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.2 seq 64424509489 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509487 -lt 64424509489 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509490 -lt 64424509489 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-481036337155 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-481036337155 2026-03-08T23:03:48.846 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=481036337155 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 481036337155' 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.3 seq 481036337155 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 481036337155 -lt 481036337155 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182446 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182446 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182446 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182446' 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.4 seq 107374182446 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182447 -lt 107374182446 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:48.847 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-472446402563 2026-03-08T23:03:49.529 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-heush_pg_stats: echo 5-472446402562 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-472446402562 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=472446402562 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 472446402562' 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.5 seq 472446402562 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 472446402563 -lt 472446402562 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-150323855402 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:49.530 
INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855402 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855402 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855402' 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.6 seq 150323855402 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855403 -lt 150323855402 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691881 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691881 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691881 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691881' 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.7 seq 171798691881 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691882 -lt 171798691881 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528359 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:49.530 INFO:tasks.workunit.client.0.vm03.stderr:194333: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528359 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.clienlpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-472446402563 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=472446402563 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 472446402563' 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.5 seq 472446402563 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 472446402563 -lt 472446402563 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
echo 6-150323855403 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-150323855403 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855403 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 150323855403' 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.6 seq 150323855403 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855403 -lt 150323855403 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-171798691882 2026-03-08T23:03:49.558 
INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-171798691882 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691882 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 171798691882' 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.7 seq 171798691882 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691882 -lt 171798691882 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-193273528360 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-193273528360 2026-03-08T23:03:49.558 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528360 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtet.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528359 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528359' 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.8 seq 193273528359 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528360 -lt 193273528359 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 9-420906795020 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-420906795020 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795020 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 420906795020' 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: waiting osd.9 seq 420906795020 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795021 -lt 420906795020 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:50.634 
INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 
2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:194333: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:03:50.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 194331 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:st/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 193273528360' 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.8 seq 193273528360 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 193273528361 -lt 193273528360 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-420906795021 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-420906795021 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795021 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 420906795021' 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: waiting osd.9 seq 420906795021 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795021 -lt 420906795021 
2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 
2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:194335: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:03:50.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:03:50.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:03:50.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:03:50.684 
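The `get_num_active_clean` trace above builds a jq filter over `ceph --format json pg dump pgs`, counting PG states that contain both "active" and "clean" but not "stale". A Python re-implementation of the same filter for reference (the sample `pg_stats` document below is invented for illustration):

```python
import json

def num_active_clean(pg_dump_json: str) -> int:
    """Mirror of the jq filter in the trace:
    .pg_stats | [.[] | .state
                 | select(contains("active") and contains("clean"))
                 | select(contains("stale") | not)] | length"""
    states = (pg["state"] for pg in json.loads(pg_dump_json)["pg_stats"])
    return sum(1 for s in states
               if "active" in s and "clean" in s and "stale" not in s)

sample = json.dumps({"pg_stats": [
    {"state": "active+clean"},
    {"state": "active+clean"},
    {"state": "stale+active+clean"},   # excluded: stale
    {"state": "active+recovering"},    # excluded: not clean
]})
# Only the first two entries match the filter.
count = num_active_clean(sample)
```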
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:03:50.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:212: corrupt_and_repair_two: return_code=0 2026-03-08T23:03:50.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:213: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:03:50.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:215: corrupt_and_repair_two: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:03:50.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:216: corrupt_and_repair_two: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:03:50.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:03:50.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:03:50.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:03:50.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:03:50.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:03:50.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: 
echo true 2026-03-08T23:03:50.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:03:50.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:03:50.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:03:50.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:03:50.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:03:50.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:03:50.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:03:50.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:03:50.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:03:50.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:03:50.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:03:50.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:03:50.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:03:50.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:03:50.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:03:50.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:03:50.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:03:50.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:03:50.915 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:50.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:50.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:03:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:03:50.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:03:50.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:03:50.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:03:50.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:03:50.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:03:50.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:03:50.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:03:50.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:03:50.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:03:50.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:03:50.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:03:50.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:03:50.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:03:50.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:03:50.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:03:50.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:03:50.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:03:50.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:03:50.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:03:50.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:03:50.925 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:50.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:50.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:03:50.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:03:50.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:03:50.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:03:50.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:03:50.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:50.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:50.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:03:50.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:03:50.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_and_repair_lrc_overwrites td/osd-scrub-repair 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:846: TEST_corrupt_and_repair_lrc_overwrites: '[' true = true ']' 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:847: TEST_corrupt_and_repair_lrc_overwrites: corrupt_and_repair_lrc td/osd-scrub-repair true 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:825: corrupt_and_repair_lrc: local dir=td/osd-scrub-repair 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:826: corrupt_and_repair_lrc: local allow_overwrites=true 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:827: corrupt_and_repair_lrc: local poolname=ecpool 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:829: corrupt_and_repair_lrc: run_mon td/osd-scrub-repair a 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:03:50.930 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:03:50.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T23:03:50.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:03:50.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:50.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:50.960 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:50.960 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:50.960 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:50.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: 
get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:50.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:03:50.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:03:50.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:03:50.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:03:50.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:03:50.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:03:50.993 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:03:50.993 
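Note that `run_mon` passes `--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok` single-quoted: `$cluster` and `$name` are Ceph configuration metavariables expanded by the daemon itself, not by the shell. A simplified stand-in for that expansion (real Ceph metavariable handling covers more variables than these two):

```python
def expand_metavars(template: str, cluster: str, name: str) -> str:
    """Simplified sketch of Ceph's $cluster/$name metavariable
    expansion; the daemon performs this substitution, which is why
    the trace single-quotes the argument to keep the shell out of it."""
    return template.replace("$cluster", cluster).replace("$name", name)

path = expand_metavars("/tmp/ceph-asok.43024/$cluster-$name.asok",
                       "ceph", "mon.a")
# Matches the resolved socket seen later in the trace:
# /tmp/ceph-asok.43024/ceph-mon.a.asok
```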
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:03:50.993 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:03:50.993 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:03:50.993 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:50.993 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:50.993 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:03:50.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:03:50.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:03:50.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:03:51.065 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:51.065 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:51.066 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:03:51.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:03:51.066 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:03:51.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:830: corrupt_and_repair_lrc: run_mgr td/osd-scrub-repair x 2026-03-08T23:03:51.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:03:51.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:03:51.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:03:51.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:03:51.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:03:51.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:03:51.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:03:51.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:51.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:51.245 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:51.245 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:51.245 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:51.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:51.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:03:51.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:03:51.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: seq 0 9 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd 
td/osd-scrub-repair 0 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:03:51.265 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:03:51.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:51.269 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:03:51.269 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:03:51.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:03:51.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=82b72473-cb22-4e38-a17b-dd78237c9918 2026-03-08T23:03:51.270 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 82b72473-cb22-4e38-a17b-dd78237c9918 2026-03-08T23:03:51.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 82b72473-cb22-4e38-a17b-dd78237c9918' 2026-03-08T23:03:51.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:03:51.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBXAK5pd141ERAAUNfqVbHgctpP1lhRL9ntDA== 2026-03-08T23:03:51.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBXAK5pd141ERAAUNfqVbHgctpP1lhRL9ntDA=="}' 2026-03-08T23:03:51.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 82b72473-cb22-4e38-a17b-dd78237c9918 -i td/osd-scrub-repair/0/new.json 2026-03-08T23:03:51.388 
INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:03:51.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:03:51.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBXAK5pd141ERAAUNfqVbHgctpP1lhRL9ntDA== --osd-uuid 82b72473-cb22-4e38-a17b-dd78237c9918 2026-03-08T23:03:51.416 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:51.418+0000 7fe6624448c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:51.417 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:51.422+0000 7fe6624448c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:51.419 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:51.422+0000 7fe6624448c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:03:51.419 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:51.422+0000 7fe6624448c0 -1 bdev(0x56303381ac00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:03:51.419 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:51.422+0000 7fe6624448c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:03:53.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:03:53.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:03:53.683 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:03:53.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:03:53.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:03:53.792 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:03:53.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:03:53.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:03:53.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:03:53.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 
2026-03-08T23:03:53.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:03:53.839 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:53.834+0000 7fed4ecf88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:53.845 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:53.850+0000 7fed4ecf88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:53.862 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:53.862+0000 7fed4ecf88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:03:53.919 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:03:53.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:03:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:03:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:03:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:03:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:03:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:03:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:03:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:54.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:55.024 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:03:55.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:55.024 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:55.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:03:55.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:55.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:03:55.074 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:55.078+0000 7fed4ecf88c0 -1 Falling back to public interface 2026-03-08T23:03:55.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:56.045 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:56.050+0000 7fed4ecf88c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:03:56.203 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:03:56.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:56.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:56.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:03:56.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:56.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:03:56.378 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:03:57.379 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:03:57.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:03:57.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:03:57.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:03:57.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:03:57.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/323748520,v1:127.0.0.1:6803/323748520] [v2:127.0.0.1:6804/323748520,v1:127.0.0.1:6805/323748520] exists,up 82b72473-cb22-4e38-a17b-dd78237c9918 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 
2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 1 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:03:57.551 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:03:57.551 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:03:57.552 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:03:57.552 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:03:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:03:57.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:03:57.554 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 6bfa7480-7f3d-4119-b4f3-b2e75203b698 2026-03-08T23:03:57.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=6bfa7480-7f3d-4119-b4f3-b2e75203b698 2026-03-08T23:03:57.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 6bfa7480-7f3d-4119-b4f3-b2e75203b698' 2026-03-08T23:03:57.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:03:57.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBdAK5poTogIhAA2jyNidKEK9uURd06FqbtJA== 2026-03-08T23:03:57.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBdAK5poTogIhAA2jyNidKEK9uURd06FqbtJA=="}' 2026-03-08T23:03:57.566 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 6bfa7480-7f3d-4119-b4f3-b2e75203b698 -i td/osd-scrub-repair/1/new.json 2026-03-08T23:03:57.728 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:03:57.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:03:57.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBdAK5poTogIhAA2jyNidKEK9uURd06FqbtJA== --osd-uuid 6bfa7480-7f3d-4119-b4f3-b2e75203b698 2026-03-08T23:03:57.760 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:57.762+0000 7efdde4c38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:57.761 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:57.766+0000 7efdde4c38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:03:57.762 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:57.766+0000 7efdde4c38c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:03:57.763 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:57.766+0000 7efdde4c38c0 -1 bdev(0x559a439f5c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:03:57.763 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:03:57.766+0000 7efdde4c38c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:04:00.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:04:00.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:04:00.790 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:04:00.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:04:00.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:04:00.993 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:04:00.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:04:00.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:04:00.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:04:00.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:04:00.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:04:01.013 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:01.018+0000 7fd7c11438c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:01.015 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:01.018+0000 7fd7c11438c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:01.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:01.018+0000 7fd7c11438c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:01.172 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:04:01.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:01.966 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:01.970+0000 7fd7c11438c0 -1 Falling back to public interface 2026-03-08T23:04:02.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:04:02.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:02.336 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:04:02.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:04:02.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:02.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:04:02.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:02.927 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:02.930+0000 7fd7c11438c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:04:03.502 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:04:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:04:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:04:03.706 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:04.707 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:04:04.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:04.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:04.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:04:04.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:04:04.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:04.878 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2724844676,v1:127.0.0.1:6811/2724844676] [v2:127.0.0.1:6812/2724844676,v1:127.0.0.1:6813/2724844676] exists,up 6bfa7480-7f3d-4119-b4f3-b2e75203b698 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 
2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 2 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:04:04.879 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:04:04.879 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:04:04.880 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:04:04.880 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:04:04.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:04:04.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:04:04.882 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 a06372a4-0da3-4836-a674-c364b96ce422 2026-03-08T23:04:04.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a06372a4-0da3-4836-a674-c364b96ce422 2026-03-08T23:04:04.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 a06372a4-0da3-4836-a674-c364b96ce422' 2026-03-08T23:04:04.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:04:04.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBkAK5p4k2yNRAAU1W+OQ+k5RcE8dPXQYzSgg== 2026-03-08T23:04:04.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBkAK5p4k2yNRAAU1W+OQ+k5RcE8dPXQYzSgg=="}' 2026-03-08T23:04:04.894 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a06372a4-0da3-4836-a674-c364b96ce422 -i td/osd-scrub-repair/2/new.json 2026-03-08T23:04:05.060 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:04:05.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T23:04:05.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBkAK5p4k2yNRAAU1W+OQ+k5RcE8dPXQYzSgg== --osd-uuid a06372a4-0da3-4836-a674-c364b96ce422 2026-03-08T23:04:05.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:05.098+0000 7f08979a78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:05.095 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:05.098+0000 7f08979a78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:05.096 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:05.098+0000 7f08979a78c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:05.096 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:05.098+0000 7f08979a78c0 -1 bdev(0x55966f397c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:04:05.096 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:05.098+0000 7f08979a78c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T23:04:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T23:04:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:04:07.487 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:04:07.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:04:07.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:04:07.705 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:04:07.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:04:07.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:04:07.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:04:07.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:04:07.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:04:07.723 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:07.726+0000 7f0215e9e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:07.723 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:07.726+0000 7f0215e9e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:07.725 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:07.726+0000 7f0215e9e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:07.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:04:08.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:08.682 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:08.686+0000 7f0215e9e8c0 -1 Falling back to public interface 2026-03-08T23:04:09.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:04:09.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:09.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:04:09.071 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:04:09.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:04:09.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:09.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:09.894 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:09.898+0000 7f0215e9e8c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:04:10.237 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:04:10.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:10.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:10.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:04:10.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:10.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:04:10.459 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:10.998 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:11.002+0000 7f0211657640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T23:04:11.460 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:04:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:04:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1105380194,v1:127.0.0.1:6819/1105380194] [v2:127.0.0.1:6820/1105380194,v1:127.0.0.1:6821/1105380194] exists,up a06372a4-0da3-4836-a674-c364b96ce422 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:04:11.634 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 3 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: 
ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:04:11.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:04:11.635 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:04:11.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:04:11.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:04:11.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:04:11.636 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:04:11.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:04:11.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:04:11.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:04:11.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:04:11.638 INFO:tasks.workunit.client.0.vm03.stdout:add osd3 3753215e-6921-4e7a-a0f3-4ad20a9c1a4f 2026-03-08T23:04:11.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3753215e-6921-4e7a-a0f3-4ad20a9c1a4f 2026-03-08T23:04:11.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 3753215e-6921-4e7a-a0f3-4ad20a9c1a4f' 2026-03-08T23:04:11.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:04:11.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBrAK5plYQdJxAAApgNc0eYl2cSgVBlLEkvKQ== 2026-03-08T23:04:11.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBrAK5plYQdJxAAApgNc0eYl2cSgVBlLEkvKQ=="}' 2026-03-08T23:04:11.650 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3753215e-6921-4e7a-a0f3-4ad20a9c1a4f -i td/osd-scrub-repair/3/new.json 2026-03-08T23:04:11.813 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:04:11.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/3/new.json 2026-03-08T23:04:11.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBrAK5plYQdJxAAApgNc0eYl2cSgVBlLEkvKQ== --osd-uuid 3753215e-6921-4e7a-a0f3-4ad20a9c1a4f 2026-03-08T23:04:11.845 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:11.850+0000 7fd0800408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:11.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:11.850+0000 7fd0800408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:11.848 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:11.850+0000 7fd0800408c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:11.849 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:11.854+0000 7fd0800408c0 -1 bdev(0x5558c5f21c00 td/osd-scrub-repair/3/block) open stat got: (1) Operation not permitted 2026-03-08T23:04:11.849 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:11.854+0000 7fd0800408c0 -1 bluestore(td/osd-scrub-repair/3) _read_fsid unparsable uuid 2026-03-08T23:04:14.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/3/keyring 2026-03-08T23:04:14.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:04:14.354 INFO:tasks.workunit.client.0.vm03.stdout:adding osd3 key to auth repository 2026-03-08T23:04:14.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T23:04:14.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:04:14.557 INFO:tasks.workunit.client.0.vm03.stdout:start osd.3 2026-03-08T23:04:14.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T23:04:14.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:04:14.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:04:14.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:04:14.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:04:14.571 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:14.574+0000 7f7edf0178c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:14.578 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:14.582+0000 7f7edf0178c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:14.580 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:14.582+0000 7f7edf0178c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:14.732 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:04:14.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T23:04:14.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:04:14.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:04:14.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:04:14.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:04:14.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:14.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:04:14.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:14.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:04:14.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:15.038 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:15.042+0000 7f7edf0178c0 -1 Falling back to public interface 2026-03-08T23:04:15.899 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:04:15.899 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:15.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:15.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:04:15.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:15.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:04:16.022 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:16.026+0000 7f7edf0178c0 -1 osd.3 0 log_to_monitors true 2026-03-08T23:04:16.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:17.094 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:04:17.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:17.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:17.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:04:17.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:04:17.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:17.280 
INFO:tasks.workunit.client.0.vm03.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/3961838415,v1:127.0.0.1:6827/3961838415] [v2:127.0.0.1:6828/3961838415,v1:127.0.0.1:6829/3961838415] exists,up 3753215e-6921-4e7a-a0f3-4ad20a9c1a4f 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 4 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=4 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/4 2026-03-08T23:04:17.280 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/4' 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/4/journal' 2026-03-08T23:04:17.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:04:17.281 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:04:17.281 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:04:17.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:04:17.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:04:17.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:04:17.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:04:17.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:04:17.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/4 2026-03-08T23:04:17.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:04:17.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=01602397-6869-43c5-a76b-cb064608ac29 2026-03-08T23:04:17.284 INFO:tasks.workunit.client.0.vm03.stdout:add osd4 01602397-6869-43c5-a76b-cb064608ac29 2026-03-08T23:04:17.284 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd4 01602397-6869-43c5-a76b-cb064608ac29' 2026-03-08T23:04:17.284 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:04:17.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBxAK5p/kEIEhAAiJJ2ZHhQ2lZsRAG3qDfBKg== 2026-03-08T23:04:17.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBxAK5p/kEIEhAAiJJ2ZHhQ2lZsRAG3qDfBKg=="}' 2026-03-08T23:04:17.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 01602397-6869-43c5-a76b-cb064608ac29 -i td/osd-scrub-repair/4/new.json 2026-03-08T23:04:17.495 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:04:17.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/4/new.json 2026-03-08T23:04:17.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 4 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/4 --osd-journal=td/osd-scrub-repair/4/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBxAK5p/kEIEhAAiJJ2ZHhQ2lZsRAG3qDfBKg== --osd-uuid 01602397-6869-43c5-a76b-cb064608ac29 2026-03-08T23:04:17.524 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:17.526+0000 7fc4874528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:17.577 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:17.582+0000 7fc4874528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:17.580 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:17.582+0000 7fc4874528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:17.580 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:17.582+0000 7fc4874528c0 -1 bdev(0x560dc1131c00 td/osd-scrub-repair/4/block) open stat got: (1) Operation not permitted 2026-03-08T23:04:17.580 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:17.582+0000 7fc4874528c0 -1 bluestore(td/osd-scrub-repair/4) _read_fsid unparsable uuid 2026-03-08T23:04:19.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/4/keyring 2026-03-08T23:04:19.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:04:19.970 INFO:tasks.workunit.client.0.vm03.stdout:adding osd4 key to auth repository 2026-03-08T23:04:19.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd4 key to auth repository 2026-03-08T23:04:19.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/4/keyring auth add osd.4 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 
2026-03-08T23:04:20.178 INFO:tasks.workunit.client.0.vm03.stdout:start osd.4 2026-03-08T23:04:20.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.4 2026-03-08T23:04:20.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 4 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/4 --osd-journal=td/osd-scrub-repair/4/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:04:20.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:04:20.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:04:20.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:04:20.195 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:20.198+0000 7fce8825a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:20.195 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:20.198+0000 7fce8825a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:20.197 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:20.198+0000 7fce8825a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 4 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=4 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:20.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T23:04:20.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:21.398 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:21.402+0000 7fce8825a8c0 -1 Falling back to public 
interface 2026-03-08T23:04:21.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:21.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:21.542 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:04:21.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:04:21.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:21.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T23:04:21.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:22.352 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:22.354+0000 7fce8825a8c0 -1 osd.4 0 log_to_monitors true 2026-03-08T23:04:22.699 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:04:22.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:22.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:22.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:04:22.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T23:04:22.699 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:22.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:23.830 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:23.834+0000 7fce83a13640 -1 osd.4 0 waiting for initial osdmap 2026-03-08T23:04:23.920 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:04:23.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:23.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:23.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:04:23.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:23.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.4 up' 2026-03-08T23:04:24.115 INFO:tasks.workunit.client.0.vm03.stdout:osd.4 up in weight 1 up_from 25 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6834/2699152069,v1:127.0.0.1:6835/2699152069] [v2:127.0.0.1:6836/2699152069,v1:127.0.0.1:6837/2699152069] exists,up 01602397-6869-43c5-a76b-cb064608ac29 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:04:24.116 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 5 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=5 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' 
--osd-journal-size=100' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:04:24.116 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:04:24.116 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:04:24.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:04:24.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:04:24.118 INFO:tasks.workunit.client.0.vm03.stdout:add osd5 8d6c8677-0ab7-4e74-91ad-7f9acd06e683 2026-03-08T23:04:24.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=8d6c8677-0ab7-4e74-91ad-7f9acd06e683 2026-03-08T23:04:24.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd5 8d6c8677-0ab7-4e74-91ad-7f9acd06e683' 2026-03-08T23:04:24.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:04:24.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB4AK5psQArCBAAa6bkalUmX5p8MpyxQc2bLg== 2026-03-08T23:04:24.131 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB4AK5psQArCBAAa6bkalUmX5p8MpyxQc2bLg=="}' 2026-03-08T23:04:24.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 8d6c8677-0ab7-4e74-91ad-7f9acd06e683 -i td/osd-scrub-repair/5/new.json 2026-03-08T23:04:24.304 INFO:tasks.workunit.client.0.vm03.stdout:5 2026-03-08T23:04:24.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/5/new.json 2026-03-08T23:04:24.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB4AK5psQArCBAAa6bkalUmX5p8MpyxQc2bLg== --osd-uuid 8d6c8677-0ab7-4e74-91ad-7f9acd06e683 2026-03-08T23:04:24.336 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:24.338+0000 7fade31d58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:24.338 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:24.342+0000 7fade31d58c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:24.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:24.342+0000 7fade31d58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:24.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:24.342+0000 7fade31d58c0 -1 bdev(0x55c6da6d7c00 td/osd-scrub-repair/5/block) open stat got: (1) Operation not permitted 2026-03-08T23:04:24.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:24.342+0000 7fade31d58c0 -1 bluestore(td/osd-scrub-repair/5) _read_fsid unparsable uuid 2026-03-08T23:04:26.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/5/keyring 2026-03-08T23:04:26.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:04:26.842 INFO:tasks.workunit.client.0.vm03.stdout:adding osd5 key to auth repository 2026-03-08T23:04:26.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd5 key to auth repository 2026-03-08T23:04:26.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/5/keyring auth add osd.5 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:04:27.041 INFO:tasks.workunit.client.0.vm03.stdout:start osd.5 2026-03-08T23:04:27.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.5 2026-03-08T23:04:27.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:04:27.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:04:27.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:04:27.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:04:27.057 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:27.058+0000 7f485fc918c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:27.065 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:27.070+0000 7f485fc918c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:27.067 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:27.070+0000 7f485fc918c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:27.219 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 5 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:27.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:04:27.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:28.384 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:04:28.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:28.384 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:28.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:04:28.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:28.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:04:28.518 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:28.522+0000 7f485fc918c0 -1 Falling back to public interface 2026-03-08T23:04:28.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:29.552 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:04:29.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:29.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:29.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:04:29.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:29.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:04:29.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:29.736 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:29.738+0000 7f485fc918c0 -1 osd.5 0 log_to_monitors true 2026-03-08T23:04:30.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:30.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:30.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:04:30.726 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:04:30.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:30.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:04:30.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:31.562 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:31.566+0000 7f485b44a640 -1 osd.5 0 waiting for initial osdmap 2026-03-08T23:04:31.905 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:04:31.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:31.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:31.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:04:31.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: 
wait_for_osd: grep 'osd.5 up' 2026-03-08T23:04:31.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:32.215 INFO:tasks.workunit.client.0.vm03.stdout:osd.5 up in weight 1 up_from 29 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6842/3186645677,v1:127.0.0.1:6843/3186645677] [v2:127.0.0.1:6844/3186645677,v1:127.0.0.1:6845/3186645677] exists,up 8d6c8677-0ab7-4e74-91ad-7f9acd06e683 2026-03-08T23:04:32.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 6 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=6 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: 
run_osd: shift 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/6 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/6' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/6/journal' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:04:32.216 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:04:32.216 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:04:32.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/6 2026-03-08T23:04:32.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:04:32.218 INFO:tasks.workunit.client.0.vm03.stdout:add osd6 2af8ad3a-2b7a-4d5f-84ef-7a995b27d06a 2026-03-08T23:04:32.218 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2af8ad3a-2b7a-4d5f-84ef-7a995b27d06a 2026-03-08T23:04:32.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd6 2af8ad3a-2b7a-4d5f-84ef-7a995b27d06a' 2026-03-08T23:04:32.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:04:32.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCAAK5plnkIDhAAVREtOkUEH40j8dF0asiGtg== 2026-03-08T23:04:32.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCAAK5plnkIDhAAVREtOkUEH40j8dF0asiGtg=="}' 2026-03-08T23:04:32.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2af8ad3a-2b7a-4d5f-84ef-7a995b27d06a -i td/osd-scrub-repair/6/new.json 2026-03-08T23:04:32.926 INFO:tasks.workunit.client.0.vm03.stdout:6 2026-03-08T23:04:32.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/6/new.json 2026-03-08T23:04:32.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 6 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/6 --osd-journal=td/osd-scrub-repair/6/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCAAK5plnkIDhAAVREtOkUEH40j8dF0asiGtg== --osd-uuid 2af8ad3a-2b7a-4d5f-84ef-7a995b27d06a 2026-03-08T23:04:32.960 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:32.962+0000 7f962aa8a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:32.962 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:32.966+0000 7f962aa8a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:32.963 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:32.966+0000 7f962aa8a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:32.964 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:32.966+0000 7f962aa8a8c0 -1 bdev(0x55bde0929c00 td/osd-scrub-repair/6/block) open stat got: (1) Operation not permitted 2026-03-08T23:04:32.964 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:32.966+0000 7f962aa8a8c0 -1 bluestore(td/osd-scrub-repair/6) _read_fsid unparsable uuid 2026-03-08T23:04:35.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/6/keyring 2026-03-08T23:04:35.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:04:35.255 INFO:tasks.workunit.client.0.vm03.stdout:adding osd6 key to auth repository 2026-03-08T23:04:35.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd6 key to auth repository 2026-03-08T23:04:35.255 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/6/keyring auth add osd.6 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:04:35.467 INFO:tasks.workunit.client.0.vm03.stdout:start osd.6 2026-03-08T23:04:35.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.6 2026-03-08T23:04:35.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 6 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/6 --osd-journal=td/osd-scrub-repair/6/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:04:35.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:04:35.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:04:35.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:04:35.483 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:35.486+0000 7f4c25fcc8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:35.484 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:35.486+0000 7f4c25fcc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:35.485 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:35.490+0000 7f4c25fcc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:35.652 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 6 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=6 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:35.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T23:04:35.842 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:36.844 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:04:36.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:36.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:36.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:04:36.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:36.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T23:04:36.930 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:36.934+0000 7f4c25fcc8c0 -1 Falling back to public interface 2026-03-08T23:04:37.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:37.916 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:37.918+0000 7f4c25fcc8c0 -1 osd.6 0 log_to_monitors true 2026-03-08T23:04:38.031 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:04:38.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:38.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:38.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T23:04:38.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:38.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T23:04:38.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:39.253 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:04:39.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:39.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:39.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:04:39.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:39.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.6 up' 2026-03-08T23:04:39.425 INFO:tasks.workunit.client.0.vm03.stdout:osd.6 up in weight 1 up_from 34 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6850/3797779309,v1:127.0.0.1:6851/3797779309] [v2:127.0.0.1:6852/3797779309,v1:127.0.0.1:6853/3797779309] exists,up 2af8ad3a-2b7a-4d5f-84ef-7a995b27d06a 2026-03-08T23:04:39.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:04:39.425 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:04:39.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:04:39.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 7 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=7 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/7 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 
2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/7' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/7/journal' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:04:39.426 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: 
run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:04:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/7 2026-03-08T23:04:39.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:04:39.428 INFO:tasks.workunit.client.0.vm03.stdout:add osd7 ebc11c24-30fd-4e5d-9446-9ba870b3b429 2026-03-08T23:04:39.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ebc11c24-30fd-4e5d-9446-9ba870b3b429 2026-03-08T23:04:39.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd7 ebc11c24-30fd-4e5d-9446-9ba870b3b429' 2026-03-08T23:04:39.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:04:39.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: 
OSD_SECRET=AQCHAK5pW364GhAAnWk0FXROogCJweXydlhGZg== 2026-03-08T23:04:39.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCHAK5pW364GhAAnWk0FXROogCJweXydlhGZg=="}' 2026-03-08T23:04:39.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ebc11c24-30fd-4e5d-9446-9ba870b3b429 -i td/osd-scrub-repair/7/new.json 2026-03-08T23:04:39.641 INFO:tasks.workunit.client.0.vm03.stdout:7 2026-03-08T23:04:39.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/7/new.json 2026-03-08T23:04:39.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 7 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/7 --osd-journal=td/osd-scrub-repair/7/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCHAK5pW364GhAAnWk0FXROogCJweXydlhGZg== --osd-uuid ebc11c24-30fd-4e5d-9446-9ba870b3b429 2026-03-08T23:04:39.675 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:39.678+0000 7fddeb0fe8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:39.677 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:39.678+0000 7fddeb0fe8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:39.678 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:39.682+0000 7fddeb0fe8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:39.678 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:39.682+0000 7fddeb0fe8c0 -1 bdev(0x562d21c41c00 td/osd-scrub-repair/7/block) open stat got: (1) Operation not permitted 2026-03-08T23:04:39.678 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:39.682+0000 7fddeb0fe8c0 -1 bluestore(td/osd-scrub-repair/7) _read_fsid unparsable uuid 2026-03-08T23:04:42.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/7/keyring 2026-03-08T23:04:42.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:04:42.439 INFO:tasks.workunit.client.0.vm03.stdout:adding osd7 key to auth repository 2026-03-08T23:04:42.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd7 key to auth repository 2026-03-08T23:04:42.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/7/keyring auth add osd.7 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:04:42.642 INFO:tasks.workunit.client.0.vm03.stdout:start osd.7 2026-03-08T23:04:42.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.7 2026-03-08T23:04:42.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
7 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/7 --osd-journal=td/osd-scrub-repair/7/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:04:42.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:04:42.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:04:42.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:04:42.658 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:42.658+0000 7fa9a30c88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:42.661 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:42.666+0000 7fa9a30c88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:42.663 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:42.666+0000 7fa9a30c88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:42.821 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:04:42.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 7 2026-03-08T23:04:42.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:04:42.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=7 2026-03-08T23:04:42.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:04:42.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:04:42.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:42.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:04:42.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:42.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:04:43.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:43.366 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:43.370+0000 7fa9a30c88c0 -1 Falling back to public interface 2026-03-08T23:04:44.014 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:04:44.014 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:44.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:44.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:04:44.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:44.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:04:44.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:44.601 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:44.606+0000 7fa9a30c88c0 -1 osd.7 0 log_to_monitors true 2026-03-08T23:04:45.187 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:04:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:04:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:04:45.380 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:46.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:46.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:46.381 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:04:46.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:04:46.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:46.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:04:46.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:04:47.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:04:47.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:04:47.602 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:04:47.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:04:47.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:04:47.603 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.7 up' 2026-03-08T23:04:47.795 INFO:tasks.workunit.client.0.vm03.stdout:osd.7 up in weight 1 up_from 39 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6858/1540728232,v1:127.0.0.1:6859/1540728232] [v2:127.0.0.1:6860/1540728232,v1:127.0.0.1:6861/1540728232] exists,up ebc11c24-30fd-4e5d-9446-9ba870b3b429 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9) 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 8 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=8 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:04:47.796 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/8 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/8' 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/8/journal' 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:04:47.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:04:47.796 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:04:47.797 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:04:47.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/8 2026-03-08T23:04:47.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:04:47.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local 
uuid=aae95067-e887-4335-9ae8-6b04ed02726c 2026-03-08T23:04:47.799 INFO:tasks.workunit.client.0.vm03.stdout:add osd8 aae95067-e887-4335-9ae8-6b04ed02726c 2026-03-08T23:04:47.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd8 aae95067-e887-4335-9ae8-6b04ed02726c' 2026-03-08T23:04:47.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:04:47.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCPAK5pMpC8MBAAuQu4rplOVim/3Rq1QN/gqg== 2026-03-08T23:04:47.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCPAK5pMpC8MBAAuQu4rplOVim/3Rq1QN/gqg=="}' 2026-03-08T23:04:47.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new aae95067-e887-4335-9ae8-6b04ed02726c -i td/osd-scrub-repair/8/new.json 2026-03-08T23:04:47.979 INFO:tasks.workunit.client.0.vm03.stdout:8 2026-03-08T23:04:47.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/8/new.json 2026-03-08T23:04:47.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 8 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/8 --osd-journal=td/osd-scrub-repair/8/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 
--debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCPAK5pMpC8MBAAuQu4rplOVim/3Rq1QN/gqg== --osd-uuid aae95067-e887-4335-9ae8-6b04ed02726c 2026-03-08T23:04:48.012 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:48.014+0000 7f2752e508c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:48.015 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:48.018+0000 7f2752e508c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:48.019 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:48.022+0000 7f2752e508c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:04:48.019 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:48.022+0000 7f2752e508c0 -1 bdev(0x55c22609fc00 td/osd-scrub-repair/8/block) open stat got: (1) Operation not permitted 2026-03-08T23:04:48.019 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:48.022+0000 7f2752e508c0 -1 bluestore(td/osd-scrub-repair/8) _read_fsid unparsable uuid 2026-03-08T23:04:50.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/8/keyring 2026-03-08T23:04:50.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:04:50.310 INFO:tasks.workunit.client.0.vm03.stdout:adding osd8 key to auth repository 2026-03-08T23:04:50.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd8 key to auth repository 2026-03-08T23:04:50.310 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/8/keyring auth add osd.8 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:04:50.518 INFO:tasks.workunit.client.0.vm03.stdout:start osd.8 2026-03-08T23:04:50.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.8 2026-03-08T23:04:50.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 8 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/8 --osd-journal=td/osd-scrub-repair/8/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:04:50.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:04:50.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:04:50.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:04:50.535 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:50.538+0000 7f93e3cce8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:04:50.535 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:50.538+0000 7f93e3cce8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:04:50.537 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:50.538+0000 7f93e3cce8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:04:50.700 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 8
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=8
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:04:50.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up'
2026-03-08T23:04:50.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:04:51.502 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:51.506+0000 7f93e3cce8c0 -1 Falling back to public interface
2026-03-08T23:04:51.884 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:04:51.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:04:51.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:04:51.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:04:51.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:04:51.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up'
2026-03-08T23:04:52.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:04:52.513 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:52.514+0000 7f93e3cce8c0 -1 osd.8 0 log_to_monitors true
2026-03-08T23:04:53.070 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:04:53.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:04:53.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:04:53.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:04:53.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:04:53.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up'
2026-03-08T23:04:53.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:04:54.252 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:04:54.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:04:54.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:04:54.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:04:54.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:04:54.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up'
2026-03-08T23:04:54.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:04:55.663 INFO:tasks.workunit.client.0.vm03.stdout:4
2026-03-08T23:04:55.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:04:55.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:04:55.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T23:04:55.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:04:55.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up'
2026-03-08T23:04:56.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:04:57.008 INFO:tasks.workunit.client.0.vm03.stdout:5
2026-03-08T23:04:57.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:04:57.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:04:57.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 5
2026-03-08T23:04:57.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:04:57.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.8 up'
2026-03-08T23:04:57.183 INFO:tasks.workunit.client.0.vm03.stdout:osd.8 up in weight 1 up_from 44 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6866/2399710432,v1:127.0.0.1:6867/2399710432] [v2:127.0.0.1:6868/2399710432,v1:127.0.0.1:6869/2399710432] exists,up aae95067-e887-4335-9ae8-6b04ed02726c
2026-03-08T23:04:57.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:831: corrupt_and_repair_lrc: for id in $(seq 0 9)
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:832: corrupt_and_repair_lrc: run_osd td/osd-scrub-repair 9
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=9
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/9
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9'
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal'
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:04:57.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:04:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/9
2026-03-08T23:04:57.186 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:04:57.187 INFO:tasks.workunit.client.0.vm03.stdout:add osd9 1123dc8d-0537-43cf-bce4-3877782ccfc1
2026-03-08T23:04:57.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1123dc8d-0537-43cf-bce4-3877782ccfc1
2026-03-08T23:04:57.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd9 1123dc8d-0537-43cf-bce4-3877782ccfc1'
2026-03-08T23:04:57.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:04:57.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCZAK5p2201DBAA8+NHysvqAnGUTzxq1D/Z8A==
2026-03-08T23:04:57.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCZAK5p2201DBAA8+NHysvqAnGUTzxq1D/Z8A=="}'
2026-03-08T23:04:57.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1123dc8d-0537-43cf-bce4-3877782ccfc1 -i td/osd-scrub-repair/9/new.json
2026-03-08T23:05:57.427 INFO:tasks.workunit.client.0.vm03.stdout:9
2026-03-08T23:04:57.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/9/new.json
2026-03-08T23:04:57.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCZAK5p2201DBAA8+NHysvqAnGUTzxq1D/Z8A== --osd-uuid 1123dc8d-0537-43cf-bce4-3877782ccfc1
2026-03-08T23:04:57.464 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:57.466+0000 7f97511928c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:04:57.466 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:57.470+0000 7f97511928c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:04:57.467 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:57.470+0000 7f97511928c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:04:57.468 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:57.470+0000 7f97511928c0 -1 bdev(0x558501817c00 td/osd-scrub-repair/9/block) open stat got: (1) Operation not permitted
2026-03-08T23:04:57.468 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:04:57.470+0000 7f97511928c0 -1 bluestore(td/osd-scrub-repair/9) _read_fsid unparsable uuid
2026-03-08T23:05:01.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/9/keyring
2026-03-08T23:05:01.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:05:01.247 INFO:tasks.workunit.client.0.vm03.stdout:adding osd9 key to auth repository
2026-03-08T23:05:01.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd9 key to auth repository
2026-03-08T23:05:01.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/9/keyring auth add osd.9 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:05:01.457 INFO:tasks.workunit.client.0.vm03.stdout:start osd.9
2026-03-08T23:05:01.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.9
2026-03-08T23:05:01.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:05:01.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:05:01.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:05:01.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:05:01.479 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:01.478+0000 7f6e3072f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:05:01.479 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:01.482+0000 7f6e3072f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:05:01.482 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:01.482+0000 7f6e3072f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:05:01.644 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:05:01.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 9
2026-03-08T23:05:01.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:05:01.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9
2026-03-08T23:05:01.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:05:01.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:05:01.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:05:01.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:05:01.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:05:01.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up'
2026-03-08T23:05:01.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:05:01.930 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:01.934+0000 7f6e3072f8c0 -1 Falling back to public interface
2026-03-08T23:05:02.816 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:05:02.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:05:02.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:05:02.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:05:02.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:05:02.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up'
2026-03-08T23:05:02.923 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:02.926+0000 7f6e3072f8c0 -1 osd.9 0 log_to_monitors true
2026-03-08T23:05:03.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:05:04.033 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:05:04.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:05:04.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:05:04.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:05:04.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:05:04.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up'
2026-03-08T23:05:04.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:05:05.310 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:05:05.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:05:05.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:05:05.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:05:05.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:05:05.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up'
2026-03-08T23:05:05.485 INFO:tasks.workunit.client.0.vm03.stdout:osd.9 up in weight 1 up_from 49 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6874/3561693313,v1:127.0.0.1:6875/3561693313] [v2:127.0.0.1:6876/3561693313,v1:127.0.0.1:6877/3561693313] exists,up 1123dc8d-0537-43cf-bce4-3877782ccfc1
2026-03-08T23:05:05.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:05:05.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:05:05.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:05:05.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:834: corrupt_and_repair_lrc: create_rbd_pool
2026-03-08T23:05:05.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:05:05.656 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T23:05:05.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:05:05.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:05:05.902 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:05:05.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:05:06.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:05:07.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:835: corrupt_and_repair_lrc: wait_for_clean 2026-03-08T23:05:07.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:05:07.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:05:07.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:05:07.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:05:07.228 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:05:07.228 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:05:07.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:05:07.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:05:07.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:05:07.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:05:07.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:05:07.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:05:07.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:05:07.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:05:07.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:05:07.480 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:07.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:05:07.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836495 2026-03-08T23:05:07.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836495 2026-03-08T23:05:07.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495' 2026-03-08T23:05:07.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:07.562 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:05:07.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672974 2026-03-08T23:05:07.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672974 2026-03-08T23:05:07.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974' 2026-03-08T23:05:07.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:07.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:05:07.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509453 2026-03-08T23:05:07.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509453 2026-03-08T23:05:07.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509453' 2026-03-08T23:05:07.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:07.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:05:07.807 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345932 2026-03-08T23:05:07.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345932 2026-03-08T23:05:07.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509453 3-85899345932' 2026-03-08T23:05:07.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:07.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:05:07.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182410 2026-03-08T23:05:07.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182410 2026-03-08T23:05:07.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509453 3-85899345932 4-107374182410' 2026-03-08T23:05:07.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:07.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:05:07.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051593 
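The trace above shows the collection phase of `flush_pg_stats`: it loops over `ceph osd ls`, tells each OSD to flush, and accumulates the returned stat sequence numbers as `osd-seq` pairs. A minimal runnable sketch of that loop follows; the `ceph` function here is a hypothetical mock standing in for the real CLI so the loop runs without a cluster, and the seq values it emits are invented for illustration.

```shell
# Mock of `ceph tell osd.N flush_pg_stats`: print a fake stat seq.
# (In the real helper this is the ceph CLI talking to a live OSD.)
ceph() {
    local osd="${2#osd.}"
    echo $(( (osd + 1) * 100 ))
}

ids='0 1 2'
seqs=''
for osd in $ids; do
    # Each flush reports the sequence number it flushed up to; remember
    # it as an "osd-seq" pair so the wait phase can poll for it later.
    seq=$(ceph tell osd.$osd flush_pg_stats)
    test -z "$seq" && continue
    seqs="$seqs $osd-$seq"
done
echo "$seqs"
```

This mirrors the `seqs=' 0-21474836495 1-42949672974 ...'` accumulation visible in the trace.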
2026-03-08T23:05:07.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051593 2026-03-08T23:05:07.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509453 3-85899345932 4-107374182410 5-124554051593' 2026-03-08T23:05:07.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:07.973 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:05:08.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888071 2026-03-08T23:05:08.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888071 2026-03-08T23:05:08.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509453 3-85899345932 4-107374182410 5-124554051593 6-146028888071' 2026-03-08T23:05:08.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:08.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:05:08.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724550 2026-03-08T23:05:08.152 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724550 2026-03-08T23:05:08.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509453 3-85899345932 4-107374182410 5-124554051593 6-146028888071 7-167503724550' 2026-03-08T23:05:08.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:08.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:05:08.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561028 2026-03-08T23:05:08.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561028 2026-03-08T23:05:08.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509453 3-85899345932 4-107374182410 5-124554051593 6-146028888071 7-167503724550 8-188978561028' 2026-03-08T23:05:08.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:08.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:05:08.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=210453397506 2026-03-08T23:05:08.337 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 210453397506 2026-03-08T23:05:08.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672974 2-64424509453 3-85899345932 4-107374182410 5-124554051593 6-146028888071 7-167503724550 8-188978561028 9-210453397506' 2026-03-08T23:05:08.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:08.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836495 2026-03-08T23:05:08.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:08.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:05:08.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:08.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836495 2026-03-08T23:05:08.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836495 2026-03-08T23:05:08.342 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836495 2026-03-08T23:05:08.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836495' 2026-03-08T23:05:08.342 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:05:08.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836495 -lt 21474836495 2026-03-08T23:05:08.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:08.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672974 2026-03-08T23:05:08.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:08.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:05:08.524 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672974 2026-03-08T23:05:08.524 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:08.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672974 2026-03-08T23:05:08.525 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672974 2026-03-08T23:05:08.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672974' 2026-03-08T23:05:08.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 
2026-03-08T23:05:08.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672974 -lt 42949672974 2026-03-08T23:05:08.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:08.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509453 2026-03-08T23:05:08.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:08.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:05:08.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509453 2026-03-08T23:05:08.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:08.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509453 2026-03-08T23:05:08.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509453' 2026-03-08T23:05:08.713 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509453 2026-03-08T23:05:08.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:05:08.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
64424509453 -lt 64424509453 2026-03-08T23:05:08.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:08.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345932 2026-03-08T23:05:08.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:08.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:05:08.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345932 2026-03-08T23:05:08.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:08.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345932 2026-03-08T23:05:08.889 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345932 2026-03-08T23:05:08.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345932' 2026-03-08T23:05:08.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:05:09.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345932 -lt 85899345932 2026-03-08T23:05:09.057 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:09.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182410 2026-03-08T23:05:09.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:09.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:05:09.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182410 2026-03-08T23:05:09.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:09.060 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.4 seq 107374182410 2026-03-08T23:05:09.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182410 2026-03-08T23:05:09.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182410' 2026-03-08T23:05:09.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:05:09.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182410 -lt 107374182410 2026-03-08T23:05:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
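The wait phase traced above splits each `osd-seq` token back apart with `cut` on the dash separator, exactly as lines 2274-2275 of ceph-helpers.sh show. A self-contained reproduction of that parsing, using one of the pairs from this run:

```shell
# Parse an "osd-seq" pair as the helper does: field 1 of the
# dash-separated token is the OSD id, field 2 the flushed seq.
s=0-21474836495
osd=$(echo $s | cut -d - -f 1)
seq=$(echo $s | cut -d - -f 2)
echo "waiting osd.$osd seq $seq"
```

This yields the `waiting osd.0 seq 21474836495` line that appears on stdout in the trace.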
2026-03-08T23:05:09.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-124554051593 2026-03-08T23:05:09.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:05:09.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-124554051593 2026-03-08T23:05:09.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:09.226 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.5 seq 124554051593 2026-03-08T23:05:09.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051593 2026-03-08T23:05:09.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 124554051593' 2026-03-08T23:05:09.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:05:09.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051593 -lt 124554051593 2026-03-08T23:05:09.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:09.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
echo 6-146028888071 2026-03-08T23:05:09.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:09.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:05:09.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888071 2026-03-08T23:05:09.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:09.393 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.6 seq 146028888071 2026-03-08T23:05:09.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888071 2026-03-08T23:05:09.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888071' 2026-03-08T23:05:09.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:05:09.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888071 -lt 146028888071 2026-03-08T23:05:09.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:09.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724550 2026-03-08T23:05:09.566 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:09.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:05:09.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724550 2026-03-08T23:05:09.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:09.568 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.7 seq 167503724550 2026-03-08T23:05:09.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724550 2026-03-08T23:05:09.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724550' 2026-03-08T23:05:09.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:05:09.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724549 -lt 167503724550 2026-03-08T23:05:09.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:05:10.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:05:10.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 
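Here the poll loop actually has to retry: osd.7's `last-stat-seq` came back as 167503724549, one behind the flushed seq 167503724550, so the helper slept 1 s, checked its timeout budget (`'[' 300 -eq 0 ']'`), and re-polled successfully. A runnable sketch of that poll/retry shape follows; the `ceph` function is a hypothetical mock that lags one behind on the first poll (as osd.7 did), and the timeout bookkeeping is a simplification, not the helper's exact arithmetic.

```shell
state=$(mktemp)

# Mock of `ceph osd last-stat-seq N`: report a seq one behind on the
# first poll, then the fully flushed seq (mimicking osd.7 above).
ceph() {
    if [ ! -s "$state" ]; then
        echo polled > "$state"
        echo 167503724549
    else
        echo 167503724550
    fi
}

osd=7 seq=167503724550 timeout=300 retries=0
while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
    sleep 1                          # back off before re-polling
    retries=$((retries + 1))
    timeout=$((timeout - 1))
    [ "$timeout" -eq 0 ] && break    # budget exhausted: give up
done
rm -f "$state"
echo "osd.$osd caught up after $retries retry"
```

One retry is enough here, matching the single `sleep 1` in the trace before `last-stat-seq 7` caught up.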
2026-03-08T23:05:10.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724550 -lt 167503724550 2026-03-08T23:05:10.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:10.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561028 2026-03-08T23:05:10.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:10.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:05:10.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561028 2026-03-08T23:05:10.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:10.917 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.8 seq 188978561028 2026-03-08T23:05:10.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561028 2026-03-08T23:05:10.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561028' 2026-03-08T23:05:10.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:05:11.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
test 188978561028 -lt 188978561028 2026-03-08T23:05:11.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:11.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-210453397506 2026-03-08T23:05:11.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:11.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:05:11.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-210453397506 2026-03-08T23:05:11.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:11.101 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.9 seq 210453397506 2026-03-08T23:05:11.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=210453397506 2026-03-08T23:05:11.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 210453397506' 2026-03-08T23:05:11.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:05:11.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 210453397506 -lt 210453397506 2026-03-08T23:05:11.283 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:05:11.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:05:11.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:05:11.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:05:11.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:05:11.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:05:11.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:05:11.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:05:11.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:05:11.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:05:11.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:05:11.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:05:11.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:05:11.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:05:11.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:837: corrupt_and_repair_lrc: create_ec_pool ecpool true k=4 m=2 l=3 plugin=lrc 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local 
allow_overwrites=true 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T23:05:11.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=4 m=2 l=3 plugin=lrc 2026-03-08T23:05:12.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T23:05:12.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T23:05:12.422 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T23:05:12.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:05:13.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' true = true ']' 2026-03-08T23:05:13.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2508: create_ec_pool: ceph osd pool set ecpool allow_ec_overwrites true 2026-03-08T23:05:13.675 INFO:tasks.workunit.client.0.vm03.stderr:set pool 2 allow_ec_overwrites to true 2026-03-08T23:05:13.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T23:05:13.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:05:13.691 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:05:13.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:05:13.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:05:13.692 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:05:13.692 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:05:13.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:05:13.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:05:13.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:05:13.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:05:13.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:05:13.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:05:13.750 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:05:13.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:05:13.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:13.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:05:14.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836498 2026-03-08T23:05:14.222 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836498 2026-03-08T23:05:14.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498' 2026-03-08T23:05:14.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:14.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:05:14.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672976 2026-03-08T23:05:14.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672976 2026-03-08T23:05:14.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976' 2026-03-08T23:05:14.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:14.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:05:14.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509455 2026-03-08T23:05:14.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509455 2026-03-08T23:05:14.779 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976 2-64424509455' 2026-03-08T23:05:14.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:14.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:05:14.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345934 2026-03-08T23:05:14.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345934 2026-03-08T23:05:14.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976 2-64424509455 3-85899345934' 2026-03-08T23:05:14.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:14.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:05:15.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182413 2026-03-08T23:05:15.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182413 2026-03-08T23:05:15.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976 
2-64424509455 3-85899345934 4-107374182413' 2026-03-08T23:05:15.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:15.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:05:15.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051595 2026-03-08T23:05:15.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051595 2026-03-08T23:05:15.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976 2-64424509455 3-85899345934 4-107374182413 5-124554051595' 2026-03-08T23:05:15.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:15.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:05:15.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888074 2026-03-08T23:05:15.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888074 2026-03-08T23:05:15.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976 2-64424509455 3-85899345934 4-107374182413 5-124554051595 6-146028888074' 2026-03-08T23:05:15.166 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:15.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:05:15.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724552 2026-03-08T23:05:15.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724552 2026-03-08T23:05:15.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976 2-64424509455 3-85899345934 4-107374182413 5-124554051595 6-146028888074 7-167503724552' 2026-03-08T23:05:15.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:15.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:05:15.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561030 2026-03-08T23:05:15.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561030 2026-03-08T23:05:15.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976 2-64424509455 3-85899345934 4-107374182413 5-124554051595 6-146028888074 7-167503724552 8-188978561030' 2026-03-08T23:05:15.328 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:15.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:05:15.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=210453397508 2026-03-08T23:05:15.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 210453397508 2026-03-08T23:05:15.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498 1-42949672976 2-64424509455 3-85899345934 4-107374182413 5-124554051595 6-146028888074 7-167503724552 8-188978561030 9-210453397508' 2026-03-08T23:05:15.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:15.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836498 2026-03-08T23:05:15.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:15.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:05:15.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836498 2026-03-08T23:05:15.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 
2 2026-03-08T23:05:15.409 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836498 2026-03-08T23:05:15.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836498 2026-03-08T23:05:15.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836498' 2026-03-08T23:05:15.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:05:15.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836497 -lt 21474836498 2026-03-08T23:05:15.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:05:16.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:05:16.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:05:16.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836498 -lt 21474836498 2026-03-08T23:05:16.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:16.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672976 2026-03-08T23:05:16.747 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:16.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:05:16.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672976 2026-03-08T23:05:16.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:16.749 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672976 2026-03-08T23:05:16.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672976 2026-03-08T23:05:16.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672976' 2026-03-08T23:05:16.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:05:16.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672977 -lt 42949672976 2026-03-08T23:05:16.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:16.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509455 2026-03-08T23:05:16.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T23:05:16.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:05:16.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509455 2026-03-08T23:05:16.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:16.924 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509455 2026-03-08T23:05:16.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509455 2026-03-08T23:05:16.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509455' 2026-03-08T23:05:16.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:05:17.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509455 -lt 64424509455 2026-03-08T23:05:17.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:17.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345934 2026-03-08T23:05:17.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:17.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 
2026-03-08T23:05:17.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345934 2026-03-08T23:05:17.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:17.093 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345934 2026-03-08T23:05:17.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345934 2026-03-08T23:05:17.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345934' 2026-03-08T23:05:17.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:05:17.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345934 -lt 85899345934 2026-03-08T23:05:17.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:17.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182413 2026-03-08T23:05:17.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:17.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:05:17.278 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 
4-107374182413 2026-03-08T23:05:17.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:17.279 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.4 seq 107374182413 2026-03-08T23:05:17.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182413 2026-03-08T23:05:17.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182413' 2026-03-08T23:05:17.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:05:17.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182413 -lt 107374182413 2026-03-08T23:05:17.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:17.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-124554051595 2026-03-08T23:05:17.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:17.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:05:17.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-124554051595 2026-03-08T23:05:17.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:17.448 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.5 seq 124554051595 2026-03-08T23:05:17.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051595 2026-03-08T23:05:17.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 124554051595' 2026-03-08T23:05:17.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:05:17.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051595 -lt 124554051595 2026-03-08T23:05:17.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:17.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888074 2026-03-08T23:05:17.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:17.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:05:17.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888074 2026-03-08T23:05:17.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:17.626 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.6 seq 146028888074 2026-03-08T23:05:17.626 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888074 2026-03-08T23:05:17.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888074' 2026-03-08T23:05:17.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:05:17.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888074 -lt 146028888074 2026-03-08T23:05:17.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:17.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:17.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724552 2026-03-08T23:05:17.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:05:17.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724552 2026-03-08T23:05:17.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:17.800 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.7 seq 167503724552 2026-03-08T23:05:17.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724552 
2026-03-08T23:05:17.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724552' 2026-03-08T23:05:17.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:05:17.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724552 -lt 167503724552 2026-03-08T23:05:17.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:17.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561030 2026-03-08T23:05:17.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:17.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:05:17.979 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561030 2026-03-08T23:05:17.979 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:17.980 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.8 seq 188978561030 2026-03-08T23:05:17.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561030 2026-03-08T23:05:17.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: 
echo 'waiting osd.8 seq 188978561030' 2026-03-08T23:05:17.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:05:18.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561031 -lt 188978561030 2026-03-08T23:05:18.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:18.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-210453397508 2026-03-08T23:05:18.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:18.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:05:18.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-210453397508 2026-03-08T23:05:18.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:18.155 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.9 seq 210453397508 2026-03-08T23:05:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=210453397508 2026-03-08T23:05:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 210453397508' 2026-03-08T23:05:18.155 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:05:18.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 210453397509 -lt 210453397508 2026-03-08T23:05:18.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:05:18.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:05:18.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:05:18.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:05:18.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:05:18.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:05:18.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:05:18.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:05:18.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:05:18.536 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:05:18.536 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:05:18.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:05:18.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:05:18.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:05:18.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:05:18.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:838: corrupt_and_repair_lrc: corrupt_and_repair_erasure_coded 
td/osd-scrub-repair ecpool 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:248: corrupt_and_repair_erasure_coded: local dir=td/osd-scrub-repair 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:249: corrupt_and_repair_erasure_coded: local poolname=ecpool 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:251: corrupt_and_repair_erasure_coded: add_something td/osd-scrub-repair ecpool 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:05:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:05:19.142 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:05:19.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: 
add_something: ceph osd set nodeep-scrub 2026-03-08T23:05:19.349 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:05:19.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:05:19.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:05:19.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T23:05:19.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:253: corrupt_and_repair_erasure_coded: get_primary ecpool SOMETHING 2026-03-08T23:05:19.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool 2026-03-08T23:05:19.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T23:05:19.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:05:19.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:05:19.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:253: corrupt_and_repair_erasure_coded: local primary=3 2026-03-08T23:05:19.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: 
corrupt_and_repair_erasure_coded: get_osds ecpool SOMETHING 2026-03-08T23:05:19.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T23:05:19.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T23:05:19.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: sed -e s/3// 2026-03-08T23:05:19.582 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:05:19.582 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:05:19.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=3 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:9 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:7' 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 3 5 9 0 6 2 1 7 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: osds=('5' '9' '0' '6' '2' '1' '7') 
2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:254: corrupt_and_repair_erasure_coded: local -a osds 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:255: corrupt_and_repair_erasure_coded: local not_primary_first=5 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:256: corrupt_and_repair_erasure_coded: local not_primary_second=9 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:259: corrupt_and_repair_erasure_coded: corrupt_and_repair_one td/osd-scrub-repair ecpool 3 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=ecpool 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=3 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 
2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:05:19.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:05:19.755 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:05:19.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:05:19.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:05:19.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:05:19.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:05:19.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:05:19.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:05:19.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:05:19.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:05:19.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T23:05:20.516 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#2:eb822e21:::SOMETHING:head# 2026-03-08T23:05:21.048 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:05:21.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:05:21.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:05:21.049 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:05:21.049 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:05:21.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:05:21.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:05:21.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:05:21.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:05:21.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T23:05:21.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:05:21.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:05:21.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:05:21.050 INFO:tasks.workunit.client.0.vm03.stderr:start osd.3 2026-03-08T23:05:21.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:05:21.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:05:21.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:05:21.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:05:21.052 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:05:21.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:05:21.070 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:21.070+0000 7f98f90298c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:05:21.077 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:21.082+0000 7f98f90298c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:05:21.079 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:21.082+0000 7f98f90298c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:05:21.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:22.034 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:22.038+0000 7f98f90298c0 -1 Falling back to public interface 2026-03-08T23:05:22.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:22.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:22.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:05:22.414 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:22.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:22.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:05:22.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:23.016 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:23.018+0000 7f98f90298c0 -1 osd.3 64 log_to_monitors true 2026-03-08T23:05:23.601 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:23.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:23.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:05:23.602 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:23.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:23.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:05:23.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:24.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:24.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:24.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:05:24.777 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:05:24.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:24.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:05:24.961 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:25.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:25.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:25.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:05:25.963 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:05:25.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:25.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:osd.3 up in weight 1 up_from 68 up_thru 68 down_at 65 last_clean_interval [20,64) [v2:127.0.0.1:6826/322320860,v1:127.0.0.1:6827/322320860] [v2:127.0.0.1:6828/322320860,v1:127.0.0.1:6829/322320860] exists,up 3753215e-6921-4e7a-a0f3-4ad20a9c1a4f 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:05:26.138 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:05:26.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:05:26.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:05:26.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:05:26.207 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:05:26.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:05:26.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:05:26.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:26.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:05:26.453 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836501 2026-03-08T23:05:26.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836501 2026-03-08T23:05:26.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501' 2026-03-08T23:05:26.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:26.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:05:26.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672980 2026-03-08T23:05:26.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672980 2026-03-08T23:05:26.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980' 2026-03-08T23:05:26.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:26.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:05:26.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509458 2026-03-08T23:05:26.612 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509458 2026-03-08T23:05:26.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509458' 2026-03-08T23:05:26.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:26.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:05:26.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776130 2026-03-08T23:05:26.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776130 2026-03-08T23:05:26.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509458 3-292057776130' 2026-03-08T23:05:26.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:26.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:05:26.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182416 2026-03-08T23:05:26.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182416 
2026-03-08T23:05:26.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509458 3-292057776130 4-107374182416' 2026-03-08T23:05:26.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:26.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:05:26.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051598 2026-03-08T23:05:26.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051598 2026-03-08T23:05:26.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509458 3-292057776130 4-107374182416 5-124554051598' 2026-03-08T23:05:26.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:26.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:05:26.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888077 2026-03-08T23:05:26.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888077 2026-03-08T23:05:26.934 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509458 3-292057776130 4-107374182416 5-124554051598 6-146028888077' 2026-03-08T23:05:26.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:26.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:05:27.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724556 2026-03-08T23:05:27.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724556 2026-03-08T23:05:27.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509458 3-292057776130 4-107374182416 5-124554051598 6-146028888077 7-167503724556' 2026-03-08T23:05:27.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:27.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:05:27.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561034 2026-03-08T23:05:27.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561034 2026-03-08T23:05:27.101 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509458 3-292057776130 4-107374182416 5-124554051598 6-146028888077 7-167503724556 8-188978561034' 2026-03-08T23:05:27.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:27.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:05:27.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=210453397512 2026-03-08T23:05:27.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 210453397512 2026-03-08T23:05:27.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672980 2-64424509458 3-292057776130 4-107374182416 5-124554051598 6-146028888077 7-167503724556 8-188978561034 9-210453397512' 2026-03-08T23:05:27.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:27.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836501 2026-03-08T23:05:27.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:27.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:05:27.181 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836501 2026-03-08T23:05:27.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:27.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836501 2026-03-08T23:05:27.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836501' 2026-03-08T23:05:27.182 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836501 2026-03-08T23:05:27.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:05:27.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836500 -lt 21474836501 2026-03-08T23:05:27.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:05:28.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:05:28.354 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:05:28.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836501 -lt 21474836501 2026-03-08T23:05:28.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T23:05:28.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672980 2026-03-08T23:05:28.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:28.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:05:28.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672980 2026-03-08T23:05:28.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:28.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672980 2026-03-08T23:05:28.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672980' 2026-03-08T23:05:28.531 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672980 2026-03-08T23:05:28.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:05:28.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672980 -lt 42949672980 2026-03-08T23:05:28.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:28.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-64424509458 2026-03-08T23:05:28.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:28.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:05:28.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509458 2026-03-08T23:05:28.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:28.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509458 2026-03-08T23:05:28.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509458' 2026-03-08T23:05:28.703 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509458 2026-03-08T23:05:28.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:05:28.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509459 -lt 64424509458 2026-03-08T23:05:28.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:28.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-292057776130 2026-03-08T23:05:28.880 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:28.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:05:28.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-292057776130 2026-03-08T23:05:28.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:28.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776130 2026-03-08T23:05:28.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 292057776130' 2026-03-08T23:05:28.882 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 292057776130 2026-03-08T23:05:28.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:05:29.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776130 -lt 292057776130 2026-03-08T23:05:29.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:29.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182416 2026-03-08T23:05:29.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T23:05:29.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:05:29.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182416 2026-03-08T23:05:29.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:29.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182416 2026-03-08T23:05:29.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182416' 2026-03-08T23:05:29.056 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.4 seq 107374182416 2026-03-08T23:05:29.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:05:29.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182416 -lt 107374182416 2026-03-08T23:05:29.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:29.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-124554051598 2026-03-08T23:05:29.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:29.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
osd=5 2026-03-08T23:05:29.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-124554051598 2026-03-08T23:05:29.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:29.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051598 2026-03-08T23:05:29.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 124554051598' 2026-03-08T23:05:29.235 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.5 seq 124554051598 2026-03-08T23:05:29.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:05:29.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051599 -lt 124554051598 2026-03-08T23:05:29.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:29.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888077 2026-03-08T23:05:29.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:29.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:05:29.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 6-146028888077 2026-03-08T23:05:29.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:29.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888077 2026-03-08T23:05:29.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888077' 2026-03-08T23:05:29.410 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.6 seq 146028888077 2026-03-08T23:05:29.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:05:29.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888077 -lt 146028888077 2026-03-08T23:05:29.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:29.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724556 2026-03-08T23:05:29.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:29.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:05:29.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724556 2026-03-08T23:05:29.586 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:29.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724556 2026-03-08T23:05:29.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724556' 2026-03-08T23:05:29.587 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.7 seq 167503724556 2026-03-08T23:05:29.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:05:29.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724556 -lt 167503724556 2026-03-08T23:05:29.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:29.761 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561034 2026-03-08T23:05:29.761 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:29.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:05:29.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561034 2026-03-08T23:05:29.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:05:29.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561034 2026-03-08T23:05:29.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561034' 2026-03-08T23:05:29.764 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.8 seq 188978561034 2026-03-08T23:05:29.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:05:29.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561034 -lt 188978561034 2026-03-08T23:05:29.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:29.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-210453397512 2026-03-08T23:05:29.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:29.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:05:29.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-210453397512 2026-03-08T23:05:29.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:29.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=210453397512 2026-03-08T23:05:29.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 210453397512' 2026-03-08T23:05:29.945 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.9 seq 210453397512 2026-03-08T23:05:29.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:05:30.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 210453397512 -lt 210453397512 2026-03-08T23:05:30.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:05:30.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:05:30.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:05:30.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:05:30.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:05:30.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:05:30.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:05:30.383 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:05:30.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:05:30.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:05:30.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:05:30.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:05:30.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:05:30.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:05:30.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:05:30.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:05:30.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:05:30.782 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:05:30.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg ecpool SOMETHING 2026-03-08T23:05:30.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:05:30.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:05:30.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:05:30.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:05:30.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=2.0 2026-03-08T23:05:30.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 2.0 2026-03-08T23:05:30.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:05:30.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:05:30.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:30.972 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:30.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:30.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:31.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:05:12.428145+0000 2026-03-08T23:05:31.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:05:31.310 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T23:05:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:05:12.428145+0000 2026-03-08T23:05:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:05:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:05:12.428145+0000 2026-03-08T23:05:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:05:31.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:05:31.322 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:05:31.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:05:31.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:31.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:31.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:31.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:31.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:12.428145+0000 '>' 2026-03-08T23:05:12.428145+0000 2026-03-08T23:05:31.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:05:32.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:05:32.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:05:32.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:05:32.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:32.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:32.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:32.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:32.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:12.428145+0000 '>' 2026-03-08T23:05:12.428145+0000 2026-03-08T23:05:32.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:05:33.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:05:33.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:05:33.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:05:33.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:33.670 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:33.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:33.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:33.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:12.428145+0000 '>' 2026-03-08T23:05:12.428145+0000 2026-03-08T23:05:33.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:05:34.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:05:34.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:05:34.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:05:34.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:34.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:34.842 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:34.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:35.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:31.998354+0000 '>' 2026-03-08T23:05:12.428145+0000 2026-03-08T23:05:35.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:05:35.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:05:35.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:05:35.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:05:35.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9 2026-03-08T23:05:35.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:05:35.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:05:35.009 
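The `wait_for_scrub` loop that just returned polls the stamp once per second, up to 300 tries, and succeeds as soon as the stamp is lexically newer than the one recorded before `ceph pg repair` was issued. ISO-8601 stamps with fixed-width fields sort chronologically as strings, which is why a plain `test a '>' b` is a valid time comparison. A sketch of that pattern (`wait_for_stamp_advance` is a hypothetical name for illustration; the real helper hard-codes the `ceph`/`jq` pipeline instead of taking a command):

```shell
# Poll a stamp-producing command until its output is lexically newer
# than $last, sleeping 1s between tries, mirroring wait_for_scrub.
wait_for_stamp_advance() {
    local get_stamp=$1 last=$2 tries=${3:-300}
    local i stamp
    for ((i = 0; i < tries; i++)); do
        stamp=$("$get_stamp")
        # String '>' on fixed-layout ISO-8601 stamps == time order.
        if [ "$stamp" \> "$last" ]; then
            return 0
        fi
        sleep 1
    done
    return 1
}
```

In the trace the first three polls see the unchanged `23:05:12.428145` stamp; the fourth sees `23:05:31.998354`, the comparison passes, and the helper returns 0.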
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=9 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.9 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:05:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:05:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:05:35.115 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:05:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:05:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:05:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9 2026-03-08T23:05:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:05:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:05:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/9 SOMETHING list-attrs 2026-03-08T23:05:35.459 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T23:05:35.460 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key 2026-03-08T23:05:35.460 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 9 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:05:35.741 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9' 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal' 2026-03-08T23:05:35.741 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:05:35.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:05:35.742 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:05:35.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/9 2026-03-08T23:05:35.743 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9 2026-03-08T23:05:35.743 INFO:tasks.workunit.client.0.vm03.stderr:start osd.9 2026-03-08T23:05:35.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:05:35.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/9/whoami 2026-03-08T23:05:35.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']' 2026-03-08T23:05:35.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:05:35.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:05:35.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:05:35.764 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:35.766+0000 
7eff8c4f88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:05:35.798 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:35.778+0000 7eff8c4f88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:05:35.798 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:35.778+0000 7eff8c4f88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:35.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:05:36.120 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:36.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:36.742+0000 7eff8c4f88c0 -1 Falling back to public interface 2026-03-08T23:05:37.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:37.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:37.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:05:37.121 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:37.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:37.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:05:37.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:37.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:37.742+0000 7eff8c4f88c0 -1 osd.9 69 log_to_monitors true 2026-03-08T23:05:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:05:38.296 
INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:38.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:05:38.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:39.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:39.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:39.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:05:39.473 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:05:39.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:39.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:05:39.633 INFO:tasks.workunit.client.0.vm03.stderr:osd.9 up in weight 1 up_from 73 up_thru 0 down_at 70 last_clean_interval [49,69) [v2:127.0.0.1:6874/2743609306,v1:127.0.0.1:6875/2743609306] [v2:127.0.0.1:6876/2743609306,v1:127.0.0.1:6877/2743609306] exists,up 1123dc8d-0537-43cf-bce4-3877782ccfc1 2026-03-08T23:05:39.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:05:39.633 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:05:39.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:05:39.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:05:39.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:05:39.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:05:39.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:05:39.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:05:39.634 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:05:39.634 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:05:39.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:05:39.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:05:39.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
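The `objectstore_tool` sequence visible above follows a fixed lifecycle: `ceph-objectstore-tool` only operates on an offline store, so the helper first stops osd.9 (`kill_daemons ... TERM osd.9`), runs the tool against its data path (`--data-path td/osd-scrub-repair/9 SOMETHING list-attrs`), restarts the OSD (`activate_osd`), and then waits for the cluster to go clean. A sketch of that wrapper shape; `stop_osd`, `start_osd`, and the command argument are placeholders standing in for the real `kill_daemons` / `ceph-objectstore-tool` / `activate_osd` steps:

```shell
# Run a maintenance command while osd $id is offline, then bring the
# OSD back; returns the command's exit status on success.
with_offline_osd() {
    local id=$1; shift
    stop_osd "$id" || return 1
    "$@"                  # e.g. the offline list-attrs invocation
    local rc=$?           # capture the tool's status before restarting
    start_osd "$id" || return 1
    return $rc
}
```

The stdout lines `_`, `hinfo_key`, `snapset` in the trace are the tool's output for the erasure-coded object: its default attribute, the EC hash-info key, and the snapset.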
get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:05:39.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:05:39.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:05:39.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:05:39.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:05:39.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:05:39.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:05:39.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:05:39.854 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:39.854 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:05:39.855 
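The `delays` array printed above comes from `get_timeout_delays 90 .1`: delays double from the base until they hit a cap (15s here, inferred from the printed array, not read from the helper's source), repeat at the cap, and a final short entry tops the total up to the timeout, so the schedule sums to exactly 90s. A sketch reproducing that schedule; it works in integer tenths of a second to avoid floating point in the shell, which is an implementation choice of this sketch, not of the real helper:

```shell
# Rebuild the wait_for_clean backoff schedule in deciseconds:
# 900 = 90s timeout, 1 = 0.1s base, 150 = 15s cap (inferred).
timeout_ds=900 base_ds=1 cap_ds=150
delays=() sum=0 d=$base_ds
while (( sum + d <= timeout_ds )); do
    delays+=("$d")
    (( sum += d ))
    if (( d * 2 <= cap_ds )); then
        (( d *= 2 ))      # exponential phase: 0.1, 0.2, ... 12.8
    else
        d=$cap_ds         # saturated phase: repeat the 15s cap
    fi
done
if (( sum < timeout_ds )); then
    delays+=($(( timeout_ds - sum )))   # final 4.5s top-up
fi
echo "${delays[@]}"       # 1 2 4 8 16 32 64 128 150 150 150 150 45
```

Dividing each entry by 10 gives exactly the array in the trace: `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`.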
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:39.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:05:39.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836505
2026-03-08T23:05:39.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836505
2026-03-08T23:05:39.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505'
2026-03-08T23:05:39.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:39.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:05:40.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672983
2026-03-08T23:05:40.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672983
2026-03-08T23:05:40.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983'
2026-03-08T23:05:40.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:40.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:05:40.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509462
2026-03-08T23:05:40.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509462
2026-03-08T23:05:40.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462'
2026-03-08T23:05:40.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:40.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:05:40.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776134
2026-03-08T23:05:40.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776134
2026-03-08T23:05:40.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-292057776134'
2026-03-08T23:05:40.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:40.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T23:05:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182420
2026-03-08T23:05:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182420
2026-03-08T23:05:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-292057776134 4-107374182420'
2026-03-08T23:05:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:40.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T23:05:40.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051602
2026-03-08T23:05:40.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051602
2026-03-08T23:05:40.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-292057776134 4-107374182420 5-124554051602'
2026-03-08T23:05:40.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:40.362 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T23:05:40.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888081
2026-03-08T23:05:40.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888081
2026-03-08T23:05:40.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-292057776134 4-107374182420 5-124554051602 6-146028888081'
2026-03-08T23:05:40.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:40.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T23:05:40.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724559
2026-03-08T23:05:40.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724559
2026-03-08T23:05:40.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-292057776134 4-107374182420 5-124554051602 6-146028888081 7-167503724559'
2026-03-08T23:05:40.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:40.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T23:05:40.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561037
2026-03-08T23:05:40.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561037
2026-03-08T23:05:40.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-292057776134 4-107374182420 5-124554051602 6-146028888081 7-167503724559 8-188978561037'
2026-03-08T23:05:40.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:05:40.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T23:05:40.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=313532612610
2026-03-08T23:05:40.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 313532612610
2026-03-08T23:05:40.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836505 1-42949672983 2-64424509462 3-292057776134 4-107374182420 5-124554051602 6-146028888081 7-167503724559 8-188978561037 9-313532612610'
2026-03-08T23:05:40.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
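The trace above is the collection phase of the `flush_pg_stats` helper in ceph-helpers.sh: for each OSD id it runs `ceph tell osd.$osd flush_pg_stats`, which returns a flush sequence number, and accumulates `osd-seq` pairs in `$seqs`. A minimal sketch of that phase, with `ceph` replaced by a stub (invented sequence values) so the loop logic runs without a cluster:

```shell
#!/usr/bin/env bash
# Sketch of the collection phase of flush_pg_stats (ceph-helpers.sh).
# `ceph` is a stub standing in for the real CLI; the seq values it
# returns are invented for illustration.
ceph() {                       # stub: "ceph tell osd.N flush_pg_stats"
    local osd=${2#osd.}        # strip the "osd." prefix from $2
    echo $(( (osd + 1) * 1000 ))
}

ids="0 1 2"
seqs=''
for osd in $ids; do
    seq=$(ceph tell osd.$osd flush_pg_stats)
    test -z "$seq" && continue      # skip an OSD that returned nothing
    seqs="$seqs $osd-$seq"          # accumulate "osd-seq" pairs
done
echo "$seqs"                        # prints " 0-1000 1-2000 2-3000"
```

The pairs are kept as a single whitespace-separated string because the later wait phase re-splits them with `cut`, as the trace shows.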
2026-03-08T23:05:40.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836505
2026-03-08T23:05:40.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:40.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:05:40.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836505
2026-03-08T23:05:40.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:40.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836505
2026-03-08T23:05:40.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836505'
2026-03-08T23:05:40.689 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836505
2026-03-08T23:05:40.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:05:40.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836505
2026-03-08T23:05:40.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:40.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672983
2026-03-08T23:05:40.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:40.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:05:40.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672983
2026-03-08T23:05:40.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:40.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672983
2026-03-08T23:05:40.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672983'
2026-03-08T23:05:40.859 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672983
2026-03-08T23:05:40.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:05:41.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672983 -lt 42949672983
2026-03-08T23:05:41.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:41.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509462
2026-03-08T23:05:41.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:41.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:05:41.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509462
2026-03-08T23:05:41.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:41.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509462
2026-03-08T23:05:41.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509462'
2026-03-08T23:05:41.028 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509462
2026-03-08T23:05:41.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:05:41.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509461 -lt 64424509462
2026-03-08T23:05:41.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:05:42.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:05:42.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:05:42.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509463 -lt 64424509462
2026-03-08T23:05:42.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:42.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-292057776134
2026-03-08T23:05:42.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:42.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:05:42.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-292057776134
2026-03-08T23:05:42.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:42.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776134
2026-03-08T23:05:42.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 292057776134'
2026-03-08T23:05:42.379 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 292057776134
2026-03-08T23:05:42.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3
2026-03-08T23:05:42.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776134 -lt 292057776134
2026-03-08T23:05:42.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:42.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182420
2026-03-08T23:05:42.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:42.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4
2026-03-08T23:05:42.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182420
2026-03-08T23:05:42.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:42.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182420
2026-03-08T23:05:42.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182420'
2026-03-08T23:05:42.554 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.4 seq 107374182420
2026-03-08T23:05:42.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4
2026-03-08T23:05:42.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182420 -lt 107374182420
2026-03-08T23:05:42.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:42.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-124554051602
2026-03-08T23:05:42.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:42.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5
2026-03-08T23:05:42.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-124554051602
2026-03-08T23:05:42.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:42.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051602
2026-03-08T23:05:42.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 124554051602'
2026-03-08T23:05:42.720 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.5 seq 124554051602
2026-03-08T23:05:42.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5
2026-03-08T23:05:42.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051603 -lt 124554051602
2026-03-08T23:05:42.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:42.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888081
2026-03-08T23:05:42.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:42.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6
2026-03-08T23:05:42.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888081
2026-03-08T23:05:42.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:42.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888081
2026-03-08T23:05:42.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888081'
2026-03-08T23:05:42.882 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.6 seq 146028888081
2026-03-08T23:05:42.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6
2026-03-08T23:05:43.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888081 -lt 146028888081
2026-03-08T23:05:43.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:43.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724559
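In the wait phase traced above, each recorded `osd-seq` pair is split back apart with `cut`, and the helper polls `ceph osd last-stat-seq $osd` until the reported value catches up with the flush sequence, sleeping 1 s between attempts against a retry budget (the `'[' 300 -eq 0 ']'` check); osd.2 above needed one such retry (64424509461 < 64424509462). A runnable sketch of that wait loop, with the `ceph` call stubbed (via a temp-file counter, since command substitution runs in a subshell) to lag behind on the first poll:

```shell
#!/usr/bin/env bash
# Sketch of the wait phase of flush_pg_stats (ceph-helpers.sh).
# `ceph osd last-stat-seq` is a stub that returns one less than the
# target on its first call, mimicking the single retry seen for osd.2.
state=$(mktemp)              # counts stub invocations across subshells
echo 0 >"$state"
ceph() {
    local n
    n=$(($(cat "$state") + 1))
    echo "$n" >"$state"
    if [ "$n" -eq 1 ]; then echo 2999; else echo 3000; fi
}

seqs=' 2-3000'               # "osd-seq" pairs from the flush phase (invented)
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)
    seq=$(echo "$s" | cut -d - -f 2)
    echo "waiting osd.$osd seq $seq"
    retries=300
    while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
        sleep 1
        retries=$((retries - 1))
        [ "$retries" -eq 0 ] && exit 1   # give up after the retry budget
    done
done
echo "osd.$osd caught up after $(cat "$state") polls"
rm -f "$state"
```

The `-lt` comparison means a stat sequence that has already moved past the target (as for osd.3 and osd.5 above, where the observed value exceeded the recorded one) also satisfies the wait immediately.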
2026-03-08T23:05:43.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:43.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7
2026-03-08T23:05:43.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724559
2026-03-08T23:05:43.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:43.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724559
2026-03-08T23:05:43.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724559'
2026-03-08T23:05:43.045 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.7 seq 167503724559
2026-03-08T23:05:43.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7
2026-03-08T23:05:43.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724560 -lt 167503724559
2026-03-08T23:05:43.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:43.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561037
2026-03-08T23:05:43.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:43.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8
2026-03-08T23:05:43.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561037
2026-03-08T23:05:43.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:43.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561037
2026-03-08T23:05:43.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561037'
2026-03-08T23:05:43.214 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.8 seq 188978561037
2026-03-08T23:05:43.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8
2026-03-08T23:05:43.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561038 -lt 188978561037
2026-03-08T23:05:43.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:05:43.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-313532612610
2026-03-08T23:05:43.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:05:43.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9
2026-03-08T23:05:43.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-313532612610
2026-03-08T23:05:43.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:05:43.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=313532612610
2026-03-08T23:05:43.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 313532612610'
2026-03-08T23:05:43.381 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.9 seq 313532612610
2026-03-08T23:05:43.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9
2026-03-08T23:05:43.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 313532612610 -lt 313532612610
2026-03-08T23:05:43.540 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:05:43.540 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:05:43.540 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:05:43.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0
2026-03-08T23:05:43.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:05:43.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:05:43.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:05:43.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:05:43.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:05:43.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:05:43.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:05:43.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5
2026-03-08T23:05:43.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:05:43.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:05:43.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:05:44.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5
2026-03-08T23:05:44.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:05:44.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:05:44.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY
2026-03-08T23:05:44.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:261: corrupt_and_repair_erasure_coded: corrupt_and_repair_one td/osd-scrub-repair ecpool 5
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=ecpool
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=5 2026-03-08T23:05:44.136
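Once every OSD has caught up, wait_for_clean compares get_num_pgs (`jq .pgmap.num_pgs` over `ceph --format json status`) against get_num_active_clean, whose jq filter counts PG states that contain both "active" and "clean" but not "stale", and breaks as soon as the two match (the `test 5 = 5` above). A sketch of that comparison with both cluster queries stubbed in plain bash (invented PG states; the real helpers pipe JSON through jq):

```shell
#!/usr/bin/env bash
# Sketch of the wait_for_clean check (ceph-helpers.sh) with the two
# cluster queries stubbed. The state list below is invented; a state
# such as "stale+active+clean" would be excluded by the stale filter.
pg_states='active+clean active+clean active+clean+scrubbing active+clean+snaptrim active+clean'

get_num_pgs() { echo 5; }            # stub for jq .pgmap.num_pgs

get_num_active_clean() {             # count active+clean, non-stale states
    local n=0 state
    for state in $pg_states; do
        case $state in
            *stale*) ;;              # stale PGs never count as clean
            *active*clean*) n=$((n + 1)) ;;
        esac
    done
    echo "$n"
}

num_pgs=$(get_num_pgs)
test "$num_pgs" == 0 && exit 1       # no PGs at all: nothing to wait for
cur_active_clean=$(get_num_active_clean)
if test "$cur_active_clean" = "$num_pgs"; then
    echo "all $num_pgs PGs active+clean"
fi
```

Note that a state like `active+clean+scrubbing` still counts, matching the real filter's use of `contains` rather than exact equality.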
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:05:44.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:05:44.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:05:44.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING remove
2026-03-08T23:05:44.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:05:44.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:05:44.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5
2026-03-08T23:05:44.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:05:44.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5
2026-03-08T23:05:44.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING remove
2026-03-08T23:05:45.097 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:eb822e21:::SOMETHING:head#
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5'
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal'
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:05:45.633
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:05:45.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:05:45.634 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:05:45.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:05:45.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T23:05:45.635 INFO:tasks.workunit.client.0.vm03.stderr:start osd.5 2026-03-08T23:05:45.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 
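The trace above shows `activate_osd` accumulating a single `ceph_args` string and then launching `ceph-osd`, with options that contain Ceph metavariables (`$name`, `$cluster`) kept single-quoted so the shell passes them through literally and the daemon expands them itself. A condensed sketch of that pattern (not the real helper; `FSID` and `MON` are placeholder variables standing in for the values the helper derives):

```shell
# Sketch of activate_osd-style argument assembly. Metavariable-bearing
# options are appended with the '$name' part single-quoted so it reaches
# ceph-osd unexpanded; ceph substitutes $name/$cluster at runtime.
start_osd_sketch() {
    local dir=$1 id=$2
    local ceph_args="--fsid=$FSID --auth-supported=none --mon-host=$MON"
    ceph_args+=" --osd-data=$dir/$id"
    ceph_args+=" --osd-journal=$dir/$id/journal"
    ceph_args+=" --run-dir=$dir"
    ceph_args+=" --log-file=$dir/"'$name'".log"   # '$name' stays literal
    ceph_args+=" --pid-file=$dir/"'$name'".pid"
    # echo instead of exec, for illustration only
    echo ceph-osd -i "$id" $ceph_args
}
```

The unquoted `$ceph_args` in the final command relies on the individual options containing no whitespace, which is why the helper builds them one `+=` at a time.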
2026-03-08T23:05:45.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami 2026-03-08T23:05:45.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']' 2026-03-08T23:05:45.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:05:45.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:05:45.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:05:45.651 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:45.650+0000 7f923609c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:05:45.653 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:45.658+0000 7f923609c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:05:45.655 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:45.658+0000 7f923609c8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:45.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:05:46.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:46.614 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:46.618+0000 7f923609c8c0 -1 Falling back to public interface 2026-03-08T23:05:47.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:05:47.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:47.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:05:47.097 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:47.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:47.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:05:47.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:47.606 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:05:47.610+0000 7f923609c8c0 -1 osd.5 74 log_to_monitors true 2026-03-08T23:05:48.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:48.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:48.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:05:48.276 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:48.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:48.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:05:48.477 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:05:49.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:05:49.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:05:49.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:05:49.478 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:05:49.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:05:49.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:05:49.684 INFO:tasks.workunit.client.0.vm03.stderr:osd.5 up in weight 1 up_from 78 up_thru 78 down_at 75 last_clean_interval [29,74) [v2:127.0.0.1:6842/2985488889,v1:127.0.0.1:6843/2985488889] [v2:127.0.0.1:6844/2985488889,v1:127.0.0.1:6845/2985488889] exists,up 8d6c8677-0ab7-4e74-91ad-7f9acd06e683 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
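The `wait_for_osd` iterations traced above reduce to a simple poll loop: print the attempt counter, grep `ceph osd dump` for the OSD's `up` state, and sleep a second between attempts, giving up after 300 tries. A hypothetical condensed form of that loop:

```shell
# Condensed sketch of the wait_for_osd loop seen in the trace: poll
# `ceph osd dump` once a second, up to 300 attempts, until the OSD
# reports up. Returns 0 on success, 1 on timeout.
wait_for_osd_up() {
    local id=$1
    local i
    for ((i = 0; i < 300; i++)); do
        echo $i                                    # progress counter, as in the trace
        if ceph osd dump | grep -q "osd.$id up"; then
            return 0
        fi
        sleep 1
    done
    return 1
}
```

In the run above the loop needed four iterations (counters 0 through 3) before `osd.5 up` appeared in the dump.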
2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:05:49.685 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:05:49.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:05:49.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:05:49.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:05:49.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:05:49.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
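`get_timeout_delays 90 .1` expands into the backoff schedule shown in the trace: delays double from the base, are capped (at 15 s here), and a final partial delay tops the sum up to the requested timeout. A hypothetical standalone reimplementation that reproduces the observed schedule (the real helper's internals may differ):

```shell
# Sketch of a get_timeout_delays-style backoff schedule: doubling delays,
# capped per-delay, with a final partial delay so the total equals the
# requested timeout.
timeout_delays() {
    local timeout=$1 base=$2 cap=${3:-15}
    awk -v t="$timeout" -v d="$base" -v cap="$cap" 'BEGIN {
        sum = 0; sep = ""
        while (t - sum > 1e-9) {
            if (d > cap) d = cap               # cap each delay
            if (sum + d > t) d = t - sum       # last delay fills the remainder
            printf "%s%g", sep, d
            sep = " "; sum += d; d *= 2
        }
        print ""
    }'
}
```

With the arguments from the trace this yields `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`, the thirteen delays in the `delays` array above, summing to the 90-second timeout.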
2026-03-08T23:05:49.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:05:49.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:05:49.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:05:49.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:5 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:6 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:7 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:8 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:9' 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:49.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:05:50.044 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836508 2026-03-08T23:05:50.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836508 2026-03-08T23:05:50.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508' 2026-03-08T23:05:50.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:05:50.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672986 2026-03-08T23:05:50.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672986 2026-03-08T23:05:50.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986' 2026-03-08T23:05:50.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:05:50.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509465 2026-03-08T23:05:50.210 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509465 2026-03-08T23:05:50.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986 2-64424509465' 2026-03-08T23:05:50.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:05:50.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776137 2026-03-08T23:05:50.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776137 2026-03-08T23:05:50.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986 2-64424509465 3-292057776137' 2026-03-08T23:05:50.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:05:50.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182423 2026-03-08T23:05:50.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182423 
2026-03-08T23:05:50.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986 2-64424509465 3-292057776137 4-107374182423' 2026-03-08T23:05:50.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:05:50.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=335007449090 2026-03-08T23:05:50.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 335007449090 2026-03-08T23:05:50.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986 2-64424509465 3-292057776137 4-107374182423 5-335007449090' 2026-03-08T23:05:50.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:05:50.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888084 2026-03-08T23:05:50.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888084 2026-03-08T23:05:50.527 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986 2-64424509465 3-292057776137 4-107374182423 5-335007449090 6-146028888084' 2026-03-08T23:05:50.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:05:50.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724562 2026-03-08T23:05:50.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724562 2026-03-08T23:05:50.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986 2-64424509465 3-292057776137 4-107374182423 5-335007449090 6-146028888084 7-167503724562' 2026-03-08T23:05:50.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:05:50.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561040 2026-03-08T23:05:50.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561040 2026-03-08T23:05:50.702 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986 2-64424509465 3-292057776137 4-107374182423 5-335007449090 6-146028888084 7-167503724562 8-188978561040' 2026-03-08T23:05:50.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:05:50.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:05:50.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=313532612613 2026-03-08T23:05:50.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 313532612613 2026-03-08T23:05:50.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836508 1-42949672986 2-64424509465 3-292057776137 4-107374182423 5-335007449090 6-146028888084 7-167503724562 8-188978561040 9-313532612613' 2026-03-08T23:05:50.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:50.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836508 2026-03-08T23:05:50.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:50.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:05:50.789 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836508 2026-03-08T23:05:50.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:50.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836508 2026-03-08T23:05:50.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836508' 2026-03-08T23:05:50.790 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836508 2026-03-08T23:05:50.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:05:50.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836508 -lt 21474836508 2026-03-08T23:05:50.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:50.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672986 2026-03-08T23:05:50.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:50.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:05:50.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672986 
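Each entry accumulated in `$seqs` is an `osd-seq` pair, which `flush_pg_stats` splits with `cut` as traced above. Reproducing one iteration with the values from the log:

```shell
# Split an "osd-seq" pair the way flush_pg_stats does in the trace.
s="0-21474836508"
osd=$(echo "$s" | cut -d - -f 1)   # field 1: the OSD id
seq=$(echo "$s" | cut -d - -f 2)   # field 2: the flush sequence number
echo "waiting osd.$osd seq $seq"
```

This prints `waiting osd.0 seq 21474836508`, matching the log line above.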
2026-03-08T23:05:50.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672986 2026-03-08T23:05:50.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672986' 2026-03-08T23:05:50.959 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672986 2026-03-08T23:05:50.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:05:51.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672985 -lt 42949672986 2026-03-08T23:05:51.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:05:52.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:05:52.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:05:52.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672987 -lt 42949672986 2026-03-08T23:05:52.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:52.301 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509465 2026-03-08T23:05:52.301 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:52.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:05:52.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509465 2026-03-08T23:05:52.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:52.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509465 2026-03-08T23:05:52.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509465' 2026-03-08T23:05:52.304 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509465 2026-03-08T23:05:52.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:05:52.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509466 -lt 64424509465 2026-03-08T23:05:52.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:52.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-292057776137 
2026-03-08T23:05:52.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:52.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:05:52.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-292057776137 2026-03-08T23:05:52.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:52.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776137 2026-03-08T23:05:52.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 292057776137' 2026-03-08T23:05:52.488 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 292057776137 2026-03-08T23:05:52.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:05:52.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776137 -lt 292057776137 2026-03-08T23:05:52.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:52.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182423 2026-03-08T23:05:52.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:05:52.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:05:52.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182423 2026-03-08T23:05:52.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:52.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182423 2026-03-08T23:05:52.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182423' 2026-03-08T23:05:52.682 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.4 seq 107374182423 2026-03-08T23:05:52.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:05:52.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182423 -lt 107374182423 2026-03-08T23:05:52.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:52.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-335007449090 2026-03-08T23:05:52.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:52.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=5 2026-03-08T23:05:52.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-335007449090 2026-03-08T23:05:52.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:52.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=335007449090 2026-03-08T23:05:52.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 335007449090' 2026-03-08T23:05:52.857 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.5 seq 335007449090 2026-03-08T23:05:52.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:05:53.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 335007449090 -lt 335007449090 2026-03-08T23:05:53.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:53.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888084 2026-03-08T23:05:53.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:53.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:05:53.029 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888084 2026-03-08T23:05:53.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:53.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888084 2026-03-08T23:05:53.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888084' 2026-03-08T23:05:53.030 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.6 seq 146028888084 2026-03-08T23:05:53.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:05:53.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888084 -lt 146028888084 2026-03-08T23:05:53.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:53.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724562 2026-03-08T23:05:53.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:53.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:05:53.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724562 
2026-03-08T23:05:53.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:53.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724562 2026-03-08T23:05:53.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724562' 2026-03-08T23:05:53.200 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.7 seq 167503724562 2026-03-08T23:05:53.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:05:53.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724563 -lt 167503724562 2026-03-08T23:05:53.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:53.363 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561040 2026-03-08T23:05:53.363 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:53.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:05:53.365 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561040 2026-03-08T23:05:53.365 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
cut -d - -f 2 2026-03-08T23:05:53.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561040 2026-03-08T23:05:53.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561040' 2026-03-08T23:05:53.366 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.8 seq 188978561040 2026-03-08T23:05:53.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:05:53.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561041 -lt 188978561040 2026-03-08T23:05:53.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:05:53.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-313532612613 2026-03-08T23:05:53.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:05:53.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:05:53.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:05:53.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-313532612613 2026-03-08T23:05:53.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=313532612613 2026-03-08T23:05:53.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 313532612613' 2026-03-08T23:05:53.552 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.9 seq 313532612613 2026-03-08T23:05:53.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:05:54.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 313532612613 -lt 313532612613 2026-03-08T23:05:54.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:05:54.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:05:54.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:05:54.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:05:54.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:05:54.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:05:54.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:05:54.464 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:05:54.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:05:54.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:05:54.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:05:54.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:05:54.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:05:54.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:05:54.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:05:54.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:05:54.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:05:54.822 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:05:54.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg ecpool SOMETHING 2026-03-08T23:05:54.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:05:54.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:05:54.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:05:54.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:05:55.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=2.0 2026-03-08T23:05:55.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 2.0 2026-03-08T23:05:55.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:05:55.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:05:55.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:55.004 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:55.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:55.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:55.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:05:31.998354+0000 2026-03-08T23:05:55.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:05:55.321 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T23:05:55.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:05:31.998354+0000 2026-03-08T23:05:55.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:05:55.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:05:31.998354+0000 2026-03-08T23:05:55.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:05:55.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:05:55.333 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:05:55.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:05:55.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:55.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:55.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:55.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:55.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:31.998354+0000 '>' 2026-03-08T23:05:31.998354+0000 2026-03-08T23:05:55.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:05:56.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:05:56.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:05:56.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:05:56.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:56.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:56.499 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:56.499 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:56.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:31.998354+0000 '>' 2026-03-08T23:05:31.998354+0000 2026-03-08T23:05:56.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:05:57.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:05:57.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:05:57.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:05:57.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:57.672 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:57.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:57.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:57.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:31.998354+0000 '>' 2026-03-08T23:05:31.998354+0000 2026-03-08T23:05:57.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:05:58.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:05:58.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:05:58.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:05:58.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:05:58.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:05:58.844 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:05:58.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:05:59.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:31.998354+0000 '>' 2026-03-08T23:05:31.998354+0000 2026-03-08T23:05:59.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:06:00.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:06:00.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:06:00.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:06:00.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:00.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:00.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:00.021 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:56.074271+0000 '>' 2026-03-08T23:05:31.998354+0000
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 9 SOMETHING list-attrs
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 9 SOMETHING list-attrs
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=9
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.9
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:06:00.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:06:00.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:06:00.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 9 SOMETHING list-attrs
2026-03-08T23:06:00.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:06:00.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:06:00.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9
2026-03-08T23:06:00.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:06:00.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/9
2026-03-08T23:06:00.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/9 SOMETHING list-attrs
2026-03-08T23:06:00.637 INFO:tasks.workunit.client.0.vm03.stdout:_
2026-03-08T23:06:00.637 INFO:tasks.workunit.client.0.vm03.stdout:hinfo_key
2026-03-08T23:06:00.638 INFO:tasks.workunit.client.0.vm03.stdout:snapset
2026-03-08T23:06:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 9
2026-03-08T23:06:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:06:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:06:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9
2026-03-08T23:06:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:06:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/9
2026-03-08T23:06:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:06:01.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:06:01.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/9
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9
2026-03-08T23:06:01.175 INFO:tasks.workunit.client.0.vm03.stderr:start osd.9
2026-03-08T23:06:01.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:06:01.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/9/whoami
2026-03-08T23:06:01.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']'
2026-03-08T23:06:01.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:06:01.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:06:01.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:06:01.192 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:01.194+0000 7ff25ecd28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:01.197 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:01.202+0000 7ff25ecd28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:01.199 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:01.202+0000 7ff25ecd28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:01.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up'
2026-03-08T23:06:01.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:02.158 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:02.162+0000 7ff25ecd28c0 -1 Falling back to public interface
2026-03-08T23:06:02.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:02.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:02.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:06:02.572 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:06:02.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:02.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up'
2026-03-08T23:06:02.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:03.166 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:03.170+0000 7ff25ecd28c0 -1 osd.9 80 log_to_monitors true
2026-03-08T23:06:03.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:03.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:03.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:06:03.740 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:06:03.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:03.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up'
2026-03-08T23:06:03.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:04.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:04.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:04.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:06:04.929 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:06:04.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up'
2026-03-08T23:06:04.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:osd.9 up in weight 1 up_from 84 up_thru 0 down_at 81 last_clean_interval [73,80) [v2:127.0.0.1:6874/3764812216,v1:127.0.0.1:6875/3764812216] [v2:127.0.0.1:6876/3764812216,v1:127.0.0.1:6877/3764812216] exists,up 1123dc8d-0537-43cf-bce4-3877782ccfc1
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:06:05.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:06:05.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:06:05.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:06:05.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:06:05.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:06:05.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:06:05.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:06:05.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:06:05.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:06:05.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:4
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:5
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:6
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:7
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:8
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:9'
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:06:05.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836512
2026-03-08T23:06:05.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836512
2026-03-08T23:06:05.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512'
2026-03-08T23:06:05.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.388 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:06:05.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672991
2026-03-08T23:06:05.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672991
2026-03-08T23:06:05.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991'
2026-03-08T23:06:05.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:06:05.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509469
2026-03-08T23:06:05.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509469
2026-03-08T23:06:05.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991 2-64424509469'
2026-03-08T23:06:05.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:06:05.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776141
2026-03-08T23:06:05.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776141
2026-03-08T23:06:05.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991 2-64424509469 3-292057776141'
2026-03-08T23:06:05.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T23:06:05.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182427
2026-03-08T23:06:05.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182427
2026-03-08T23:06:05.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991 2-64424509469 3-292057776141 4-107374182427'
2026-03-08T23:06:05.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T23:06:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=335007449094
2026-03-08T23:06:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 335007449094
2026-03-08T23:06:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991 2-64424509469 3-292057776141 4-107374182427 5-335007449094'
2026-03-08T23:06:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T23:06:05.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888088
2026-03-08T23:06:05.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888088
2026-03-08T23:06:05.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991 2-64424509469 3-292057776141 4-107374182427 5-335007449094 6-146028888088'
2026-03-08T23:06:05.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T23:06:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724566
2026-03-08T23:06:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724566
2026-03-08T23:06:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991 2-64424509469 3-292057776141 4-107374182427 5-335007449094 6-146028888088 7-167503724566'
2026-03-08T23:06:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:05.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T23:06:06.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561044
2026-03-08T23:06:06.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561044
2026-03-08T23:06:06.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991 2-64424509469 3-292057776141 4-107374182427 5-335007449094 6-146028888088 7-167503724566 8-188978561044'
2026-03-08T23:06:06.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:06.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T23:06:06.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=360777252866
2026-03-08T23:06:06.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 360777252866
2026-03-08T23:06:06.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836512 1-42949672991 2-64424509469 3-292057776141 4-107374182427 5-335007449094 6-146028888088 7-167503724566 8-188978561044 9-360777252866'
2026-03-08T23:06:06.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:06:06.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836512
2026-03-08T23:06:06.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:06:06.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:06:06.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836512
2026-03-08T23:06:06.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:06:06.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836512
2026-03-08T23:06:06.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836512'
2026-03-08T23:06:06.111 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836512
2026-03-08T23:06:06.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:06:06.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836512 -lt 21474836512
2026-03-08T23:06:06.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:06:06.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672991
2026-03-08T23:06:06.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:06:06.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:06:06.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672991
2026-03-08T23:06:06.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:06:06.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672991
2026-03-08T23:06:06.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672991'
2026-03-08T23:06:06.298 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672991
2026-03-08T23:06:06.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:06:06.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672991 -lt 42949672991
2026-03-08T23:06:06.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:06:06.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509469
2026-03-08T23:06:06.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:06:06.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:06:06.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509469
2026-03-08T23:06:06.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:06:06.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509469
2026-03-08T23:06:06.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509469'
2026-03-08T23:06:06.489 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509469
2026-03-08T23:06:06.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:06:06.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509469 -lt 64424509469
2026-03-08T23:06:06.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:06:06.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:06:06.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-292057776141
2026-03-08T23:06:06.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3
2026-03-08T23:06:06.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-292057776141
2026-03-08T23:06:06.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut
-d - -f 2 2026-03-08T23:06:06.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776141 2026-03-08T23:06:06.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 292057776141' 2026-03-08T23:06:06.674 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.3 seq 292057776141 2026-03-08T23:06:06.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:06.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776141 -lt 292057776141 2026-03-08T23:06:06.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:06.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182427 2026-03-08T23:06:06.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:06.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:06.864 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:06.864 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182427 2026-03-08T23:06:06.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=107374182427 2026-03-08T23:06:06.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182427' 2026-03-08T23:06:06.866 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.4 seq 107374182427 2026-03-08T23:06:06.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:07.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182427 -lt 107374182427 2026-03-08T23:06:07.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:07.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-335007449094 2026-03-08T23:06:07.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:07.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:06:07.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-335007449094 2026-03-08T23:06:07.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:07.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=335007449094 2026-03-08T23:06:07.048 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 335007449094' 2026-03-08T23:06:07.048 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.5 seq 335007449094 2026-03-08T23:06:07.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:06:07.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 335007449094 -lt 335007449094 2026-03-08T23:06:07.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:07.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888088 2026-03-08T23:06:07.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:07.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:07.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888088 2026-03-08T23:06:07.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:07.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888088 2026-03-08T23:06:07.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 
146028888088' 2026-03-08T23:06:07.234 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.6 seq 146028888088 2026-03-08T23:06:07.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:07.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888088 -lt 146028888088 2026-03-08T23:06:07.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:07.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724566 2026-03-08T23:06:07.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:07.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:07.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724566 2026-03-08T23:06:07.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:07.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724566 2026-03-08T23:06:07.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724566' 2026-03-08T23:06:07.423 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.7 seq 167503724566 2026-03-08T23:06:07.423 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:07.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724566 -lt 167503724566 2026-03-08T23:06:07.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:07.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561044 2026-03-08T23:06:07.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:07.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:07.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561044 2026-03-08T23:06:07.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:07.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561044 2026-03-08T23:06:07.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561044' 2026-03-08T23:06:07.600 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.8 seq 188978561044 2026-03-08T23:06:07.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 
8 2026-03-08T23:06:07.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561044 -lt 188978561044 2026-03-08T23:06:07.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:07.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-360777252866 2026-03-08T23:06:07.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:07.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:06:07.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-360777252866 2026-03-08T23:06:07.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:07.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=360777252866 2026-03-08T23:06:07.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 360777252866' 2026-03-08T23:06:07.791 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.9 seq 360777252866 2026-03-08T23:06:07.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:07.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 1 -lt 360777252866 2026-03-08T23:06:07.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:06:08.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:08.961 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 360777252866 -lt 360777252866 2026-03-08T23:06:09.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:09.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:09.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:09.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:09.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:09.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:09.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:09.331 
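The xtrace above walks the flush_pg_stats wait loop (ceph-helpers.sh:2273-2279): each `"<osd>-<seq>"` pair is split with `cut`, then the helper polls `ceph osd last-stat-seq` until the reported seq catches up. A minimal standalone sketch of that pattern, with the cluster query stubbed out (the stub value is copied from osd.0's line above; everything else is illustrative, not the real helper):

```shell
# Stub: the real helper runs `ceph osd last-stat-seq $1` against the cluster.
last_stat_seq() { echo 21474836512; }

s="0-21474836512"                       # seqs arrive as "<osd>-<seq>" pairs
osd=$(echo "$s" | cut -d - -f 1)
seq=$(echo "$s" | cut -d - -f 2)
echo "waiting osd.$osd seq $seq"
# Poll until the OSD's reported stat seq reaches the target.
while test "$(last_stat_seq "$osd")" -lt "$seq"; do
    sleep 1                             # the real helper also decrements a 300-try budget
done
```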
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:09.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:06:09.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:06:09.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:09.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:06:09.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:09.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:09.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:09.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:09.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:09.712 
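The wait_for_clean trace above boils down to one comparison: the count of active+clean (non-stale) PGs against the total PG count, breaking out of the retry loop when they match. A sketch of that check, with the two jq-backed queries stubbed (the real ones pipe `ceph --format json status` through `jq .pgmap.num_pgs`, and `ceph --format json pg dump pgs` through the state filter shown in the trace):

```shell
# Stubs: real helpers query the cluster and filter with jq.
get_num_pgs() { echo 5; }            # stands in for: jq .pgmap.num_pgs
get_num_active_clean() { echo 5; }   # stands in for: count of active+clean, non-stale PGs

cur_active_clean=$(get_num_active_clean)
# wait_for_clean breaks its retry loop once every PG is active+clean.
if test "$cur_active_clean" = "$(get_num_pgs)"; then
    state=clean
else
    state=waiting
fi
echo "$state"
```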
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:09.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:06:09.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:06:09.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:262: corrupt_and_repair_erasure_coded: corrupt_and_repair_two td/osd-scrub-repair ecpool 5 9 2026-03-08T23:06:09.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:185: corrupt_and_repair_two: local dir=td/osd-scrub-repair 2026-03-08T23:06:09.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:186: corrupt_and_repair_two: local poolname=ecpool 2026-03-08T23:06:09.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:187: corrupt_and_repair_two: local first=5 2026-03-08T23:06:09.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:188: corrupt_and_repair_two: local second=9 2026-03-08T23:06:09.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:193: corrupt_and_repair_two: pids= 2026-03-08T23:06:09.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:194: corrupt_and_repair_two: run_in_background pids objectstore_tool 
td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 219628"' 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 219628' 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:195: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 9 SOMETHING remove 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 219630"' 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 219630' 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:196: corrupt_and_repair_two: wait_background pids 2026-03-08T23:06:09.739 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 219628 219630' 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 219628 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/219631: /' 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/219633: /' 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:06:09.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 9 SOMETHING remove 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9 
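The run_in_background / wait_background pairing traced above accumulates child pids into a caller-named variable via `eval`, then waits on each pid and ORs the failures together. A simplified POSIX-ish sketch of that bookkeeping (the real bash helpers use `eval 'pids+=" $!"'` and also prefix each child's output with its pid through a sed pipe, which is omitted here):

```shell
run_in_background() {
    pid_variable=$1
    shift
    "$@" &
    # Append the child's pid to the variable the caller named.
    eval "$pid_variable=\"\$$pid_variable $!\""
}

wait_background() {
    return_code=0
    for pid in $(eval "echo \$$1"); do
        wait "$pid" || return_code=1   # OR together the children's failures
    done
    return $return_code
}

pids=
run_in_background pids true
run_in_background pids true
wait_background pids
wb_rc=$?
echo "pids:$pids rc:$wb_rc"
```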
2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 9 SOMETHING remove 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=9 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.9 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 9 SOMETHING remove 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/9 SOMETHING remove 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: remove 2#2:eb822e21:::SOMETHING:head# 
2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 9 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:06:11.057 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 
2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:06:11.059 
INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/9 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: start osd.9 2026-03-08T23:06:11.059 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:06:11.253 
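The trace above shows `activate_osd` growing `ceph_args` one flag at a time and then launching `ceph-osd -i 9 …` with some arguments single-quoted (`--log-file=td/osd-scrub-repair/$name.log`) so that `$name`/`$cluster` reach the daemon unexpanded for it to substitute. A minimal Python sketch of that command-line assembly, with the values taken from the trace; `build_osd_args` is an illustrative name, not part of ceph-helpers.sh:

```python
def build_osd_args(dir_, osd_id, fsid, mon_host):
    """Assemble a ceph-osd argv the way the traced activate_osd helper does."""
    args = [
        f"--fsid={fsid}",
        "--auth-supported=none",
        f"--mon-host={mon_host}",
        "--osd-skip-data-digest=false",
        "--osd-failsafe-full-ratio=.99",
        "--osd-journal-size=100",
        "--osd-scrub-load-threshold=2000",
        f"--osd-data={dir_}/{osd_id}",
        f"--osd-journal={dir_}/{osd_id}/journal",
        "--chdir=",
        f"--run-dir={dir_}",
        # literal $cluster/$name: expanded by the daemon, not the shell
        "--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok",
        "--debug-osd=20",
        f"--log-file={dir_}/$name.log",   # literal $name
        f"--pid-file={dir_}/$name.pid",   # literal $name
        "--osd-max-object-name-len=460",
        "--osd-max-object-namespace-len=64",
        "--enable-experimental-unrecoverable-data-corrupting-features=*",
        "--osd-mclock-profile=high_recovery_ops",
    ]
    return ["ceph-osd", "-i", str(osd_id)] + args
```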
INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: 
kill_daemons: local trace=true 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING remove 2026-03-08T23:06:11.253 
INFO:tasks.workunit.client.0.vm03.stderr:219631: remove 1#2:eb822e21:::SOMETHING:head# 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:06:11.253 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
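The `objectstore_tool` trace above follows a fixed pattern: `ceph-objectstore-tool` can only operate on an OSD's store while the daemon is down, so the helper first runs `kill_daemons … TERM osd.5`, then invokes the tool against `--data-path td/osd-scrub-repair/5`, then calls `activate_osd` to restart the OSD. A sketch of that stop/operate/restart sequence; `stop_osd`, `run_tool` and `start_osd` are hypothetical stand-ins for the traced helpers:

```python
def objectstore_tool(dir_, osd_id, *tool_args, stop_osd, run_tool, start_osd):
    """Run an offline objectstore operation against one OSD, then restart it."""
    stop_osd(dir_, f"osd.{osd_id}")                 # kill_daemons ... TERM osd.N
    rc = run_tool("--data-path", f"{dir_}/{osd_id}", *tool_args)
    start_osd(dir_, osd_id)                         # activate_osd brings it back up
    return rc
```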
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:06:11.255 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 
2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: start osd.5 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami 2026-03-08T23:06:11.256 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']' 2026-03-08T23:06:14.977 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:06:14.977 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat
td/osd-scrub-repair/9/whoami 2026-03-08T23:06:14.977 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']' 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 2026-03-08T23:06:11.078+0000 7fe3184fb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 2026-03-08T23:06:11.078+0000 7fe3184fb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 2026-03-08T23:06:11.078+0000 7fe3184fb8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 0 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 2026-03-08T23:06:12.034+0000 7fe3184fb8c0 -1 Falling back to public interface 2026-03-08T23:06:14.978 
INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 1 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 2 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 
'osd.9 up' 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 2026-03-08T23:06:13.814+0000 7fe3184fb8c0 -1 osd.9 85 log_to_monitors true 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: 3 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:14.978 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 
2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: 2026-03-08T23:06:11.290+0000 7f45355d48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: 2026-03-08T23:06:11.310+0000 7f45355d48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: 2026-03-08T23:06:11.318+0000 7f45355d48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: 0 
2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: 2026-03-08T23:06:12.530+0000 7f45355d48c0 -1 Falling back to public interface 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:06:15.175 INFO:tasks.workunit.client.0.vm03.stderr:219631: 1 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:15.176 
INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: 2 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: 2026-03-08T23:06:13.822+0000 7f45355d48c0 -1 osd.5 85 log_to_monitors true 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: 3 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:15.176 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 
2026-03-08T23:06:16.523 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:16.523 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 4 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: osd.9 up in weight 1 up_from 89 up_thru 0 down_at 86 last_clean_interval [84,85) [v2:127.0.0.1:6842/469518577,v1:127.0.0.1:6843/469518577] [v2:127.0.0.1:6844/469518577,v1:127.0.0.1:6845/469518577] exists,up 1123dc8d-0537-43cf-bce4-3877782ccfc1 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
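The `wait_for_osd` iterations traced above implement a plain bounded polling loop: up to 300 one-second attempts at `ceph osd dump | grep 'osd.N up'`, breaking as soon as the line appears (here on iteration 4, when `osd.9 up in weight 1 …` shows up in the dump). A Python sketch of the same loop, with `get_dump` as a hypothetical stand-in for `ceph osd dump`:

```python
import time

def wait_for_osd(state, osd_id, get_dump, sleep=time.sleep, tries=300):
    """Poll the osdmap dump until 'osd.N <state>' appears, up to `tries` times."""
    needle = f"osd.{osd_id} {state}"
    for _ in range(tries):
        if needle in get_dump():
            return True          # the traced helper sets status=0 and breaks here
        sleep(1)
    return False                 # corresponds to wait_for_osd's failure return
```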
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 
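The `get_timeout_delays 90 .1` call above yields the array `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: the delay doubles each retry, each step is capped (15 s, inferred from the emitted array), and a final fractional step tops the sequence up to exactly the requested 90 s timeout. A Python sketch of that backoff computation, reconstructed from the traced output rather than from the helper's source:

```python
def get_timeout_delays(timeout, first_step=1.0, max_step=15.0):
    """Exponential backoff schedule: double each step, cap at max_step,
    and pad with a final partial step so the delays sum to `timeout`."""
    delays, total, step = [], 0.0, first_step
    while total + step <= timeout:
        delays.append(round(step, 10))
        total += step
        step = min(step * 2, max_step)
    if timeout > total:
        delays.append(round(timeout - total, 10))   # final partial step
    return delays
```

Note that 0.1 + 0.2 + … + 12.8 + 4·15 + 4.5 = 90, which is why the loop in `wait_for_clean` can sleep through the whole array and know the full timeout has elapsed.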
2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 1 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 2 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 3 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 4 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 5 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 6 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 7 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 8 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 9' 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836515 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836515 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515' 2026-03-08T23:06:16.524 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:16.842 INFO:tasks.workunit.client.0.vm03.stderr:219633: //homs.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 4 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: osd.5 up in weight 1 up_from 89 up_thru 89 down_at 86 last_clean_interval [78,85) [v2:127.0.0.1:6874/2252178202,v1:127.0.0.1:6875/2252178202] [v2:127.0.0.1:6876/2252178202,v1:127.0.0.1:6877/2252178202] exists,up 8d6c8677-0ab7-4e74-91ad-7f9acd06e683 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 1 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 2 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 3 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 4 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 5 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 6 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 7 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 8 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: 9' 2026-03-08T23:06:16.843 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:06:16.844 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:16.844 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:06:16.844 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836516 2026-03-08T23:06:16.844 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836516 2026-03-08T23:06:16.844 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516' 2026-03-08T23:06:16.844 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219631: e/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672994 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672994 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994' 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509472 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509472 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994 2-64424509472' 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.222 
INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776144 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776144 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994 2-64424509472 3-292057776144' 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182430 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182430 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994 2-64424509472 3-292057776144 4-107374182430' 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=382252089346 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 382252089346 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994 2-64424509472 3-292057776144 4-107374182430 5-382252089346' 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888091 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888091 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994 2-64424509472 3-292057776144 4-107374182430 5-382252089346 6-146028888091' 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.222 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_s//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672995 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672995 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995' 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509473 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509473 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509473' 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776145 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776145 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509473 3-292057776145' 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182431 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182431 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509473 3-292057776145 4-107374182431' 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=382252089347 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 382252089347 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509473 3-292057776145 4-107374182431 5-382252089347' 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888092 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888092 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509473 3-292057776145 4-107374182431 5-382252089347 6-146028888092' 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:17.572 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:06:18.811 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flushtats: seq=167503724570 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724570 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994 2-64424509472 3-292057776144 4-107374182430 5-382252089346 6-146028888091 7-167503724570' 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561048 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561048 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994 2-64424509472 3-292057776144 4-107374182430 5-382252089346 6-146028888091 7-167503724570 8-188978561048' 
2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=382252089346 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 382252089346 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836515 1-42949672994 2-64424509472 3-292057776144 4-107374182430 5-382252089346 6-146028888091 7-167503724570 8-188978561048 9-382252089346' 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836515 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:18.812 
INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836515 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836515 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836515' 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.0 seq 21474836515 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836514 -lt 21474836515 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836516 -lt 21474836515 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:18.812 
INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672994 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:06:18.812 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672994 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubun_pg_stats: seq=167503724571 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724571 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509473 3-292057776145 4-107374182431 5-382252089347 6-146028888092 7-167503724571' 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561049 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
188978561049 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509473 3-292057776145 4-107374182431 5-382252089347 6-146028888092 7-167503724571 8-188978561049' 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=382252089347 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 382252089347 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836516 1-42949672995 2-64424509473 3-292057776145 4-107374182431 5-382252089347 6-146028888092 7-167503724571 8-188978561049 9-382252089347' 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836516 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836516 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836516 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836516' 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.0 seq 21474836516 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836514 -lt 21474836516 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836516 -lt 21474836516 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:19.110 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:19.111 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672995 2026-03-08T23:06:19.111 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:06:19.111 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672994 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672994' 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.1 seq 42949672994 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 42949672995 -lt 42949672994 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509472 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509472 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509472 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509472' 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.2 seq 64424509472 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509474 -lt 64424509472 2026-03-08T23:06:19.373 
INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-292057776144 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-292057776144 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776144 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 292057776144' 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.3 seq 292057776144 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776145 -lt 292057776144 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182430 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182430 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182430 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182430' 2026-03-08T23:06:19.373 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.4 seq 107374182430 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.clitu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672995 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672995 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672995' 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.1 seq 42949672995 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672995 -lt 42949672995 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509473 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509473 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509473 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.2 seq 64424509473' 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.2 seq 64424509473 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509474 -lt 64424509473 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:19.675 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-292057776145 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-292057776145 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776145 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 292057776145' 
2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.3 seq 292057776145 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776145 -lt 292057776145 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182431 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182431 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182431 2026-03-08T23:06:19.676 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182431' 2026-03-08T23:06:19.676 
INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.4 seq 107374182431 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clonent.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182431 -lt 107374182430 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-382252089346 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-382252089346 2026-03-08T23:06:20.119 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=382252089346 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 382252089346' 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.5 seq 
382252089346 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 382252089347 -lt 382252089346 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888091 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888091 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888091 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888091' 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.6 seq 146028888091 2026-03-08T23:06:20.120 
INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888092 -lt 146028888091 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724570 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724570 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724570 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724570' 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.7 seq 167503724570 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724571 -lt 167503724570 2026-03-08T23:06:20.120 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: e.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182431 -lt 107374182431 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-382252089347 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-382252089347 
2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=382252089347 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 382252089347' 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.5 seq 382252089347 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 382252089347 -lt 382252089347 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888092 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888092 2026-03-08T23:06:20.404 
INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888092 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888092' 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.6 seq 146028888092 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888092 -lt 146028888092 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724571 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724571 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724571 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724571' 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.7 seq 167503724571 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724571 -lt 167503724571 2026-03-08T23:06:20.404 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561048 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561048 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561048 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561048' 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.8 seq 188978561048 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561049 -lt 188978561048 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-382252089346 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-382252089346 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=382252089346 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 382252089346' 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: waiting osd.9 seq 382252089346 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 382252089348 -lt 382252089346 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:06:20.897 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:20.898 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:21.117 INFO:tasks.workunit.client.0.vm03.stderr:219633: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:21.117 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:21.117 
INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:21.117 INFO:tasks.workunit.client.0.vm03.stderr:219633: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:21.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561049 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561049 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561049 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561049' 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.8 seq 188978561049 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:06:21.140 
INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561049 -lt 188978561049 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-382252089347 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-382252089347 2026-03-08T23:06:21.140 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=382252089347 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 382252089347' 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: waiting osd.9 seq 382252089347 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 382252089348 -lt 382252089347 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph 
--format json pg dump pgs 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:06:21.141 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:219631: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:219631: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:06:21.330 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 219630 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:06:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:197: corrupt_and_repair_two: return_code=0 2026-03-08T23:06:21.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:198: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:06:21.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: get_pg ecpool SOMETHING 2026-03-08T23:06:21.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:06:21.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:06:21.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:06:21.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:06:21.494 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: local pg=2.0 2026-03-08T23:06:21.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:204: corrupt_and_repair_two: repair 2.0 2026-03-08T23:06:21.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:06:21.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:06:21.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:21.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:21.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:21.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:21.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:05:56.074271+0000 2026-03-08T23:06:21.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:06:21.814 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T23:06:21.826 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:05:56.074271+0000 2026-03-08T23:06:21.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:06:21.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:05:56.074271+0000 2026-03-08T23:06:21.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:06:21.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:06:21.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:06:21.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:06:21.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:21.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:21.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:21.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] 
| select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:21.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:56.074271+0000 '>' 2026-03-08T23:05:56.074271+0000 2026-03-08T23:06:21.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:06:22.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:06:22.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:06:22.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:06:22.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:22.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:22.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:22.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:23.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:05:56.074271+0000 '>' 2026-03-08T23:05:56.074271+0000 2026-03-08T23:06:23.163 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:06:24.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:06:24.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:06:24.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:06:24.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:24.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:24.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:24.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:06:22.079641+0000 '>' 2026-03-08T23:05:56.074271+0000 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:208: 
corrupt_and_repair_two: pids= 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:209: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 223081"' 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 223081' 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:210: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 223082"' 2026-03-08T23:06:24.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 223082' 2026-03-08T23:06:24.337 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:211: corrupt_and_repair_two: wait_background pids 2026-03-08T23:06:24.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 223081 223082' 2026-03-08T23:06:24.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:06:24.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:06:24.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 223081 2026-03-08T23:06:24.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/223084: /' 2026-03-08T23:06:24.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/223086: /' 2026-03-08T23:06:24.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:06:24.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:06:25.273 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: 
shift 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool 
--data-path td/osd-scrub-repair/5 SOMETHING list-attrs 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: _ 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: hinfo_key 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: snapset 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:06:25.274 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: 
get_asok_dir 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: 
ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T23:06:25.276 INFO:tasks.workunit.client.0.vm03.stderr:223084: start osd.5 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_rec223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: 
local dir=td/osd-scrub-repair 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=9 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=9 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.9 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 9 SOMETHING list-attrs 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=9 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:06:25.294 
INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/9 SOMETHING list-attrs 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: _ 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: hinfo_key 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: snapset 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 9 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=9 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/9 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:06:25.294 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: 
activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/9' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/9/journal' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:06:25.303 
INFO:tasks.workunit.client.0.vm03.stderr:223086: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 
2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/9 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.9 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: start osd.9 2026-03-08T23:06:25.303 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/9/whoami 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 9 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/9 --osd-journal=td/osd-scrub-repair/9/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 
--debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460overy_ops 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']' 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: 2026-03-08T23:06:25.302+0000 7efe794368c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: 2026-03-08T23:06:25.306+0000 7efe794368c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: 2026-03-08T23:06:25.314+0000 7efe794368c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:06:29.513 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: 0 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: 2026-03-08T23:06:26.526+0000 7efe794368c0 -1 Falling back to public interface 2026-03-08T23:06:29.514 
INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: 1 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: 2026-03-08T23:06:27.730+0000 7efe794368c0 -1 osd.5 91 log_to_monitors true 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: 2 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 
2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: 3 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:29.514 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223084: osd.5 up in weight 1 up_from 98 up_thru 98 down_a --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 9 = 9 ']' 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 2026-03-08T23:06:25.338+0000 7f24bca318c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 2026-03-08T23:06:25.338+0000 7f24bca318c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 2026-03-08T23:06:25.346+0000 7f24bca318c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 9 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=9 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 0 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 2026-03-08T23:06:26.042+0000 7f24bca318c0 -1 Falling back to public interface 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 1 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:29.524 
INFO:tasks.workunit.client.0.vm03.stderr:223086: 2026-03-08T23:06:27.222+0000 7f24bca318c0 -1 osd.9 91 log_to_monitors true 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 2 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: 3 2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.9 up' 
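The `wait_for_osd` traces above (for osd.5 and osd.9) show a simple poll loop: up to 300 one-second retries, each grepping `ceph osd dump` for the daemon in the wanted state. A minimal standalone sketch of that loop, assuming a `ceph` CLI pointed at the test cluster (this mirrors the traced logic, it is not the upstream `ceph-helpers.sh` source):

```shell
# Sketch of the wait_for_osd poll loop seen in the trace above.
# Assumes a working `ceph` CLI; retries once per second, up to 300 times,
# until `ceph osd dump` reports "osd.<id> <state>" (e.g. "osd.5 up").
wait_for_osd() {
    local state=$1 id=$2 status=1
    for ((i = 0; i < 300; i++)); do
        echo $i    # progress marker, matching the echoed counter in the trace
        if ceph osd dump | grep "osd.$id $state"; then
            status=0
            break
        fi
        sleep 1
    done
    return $status
}
```

In the trace, osd.5 turns up after three iterations (counters 0..3) and the helper returns 0.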
2026-03-08T23:06:29.524 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223086: osd.9 up in weight 1 up_from 96 up_thru 0 down_att 92 last_clean_interval [89,91) [v2:127.0.0.1:6874/2760692773,v1:127.0.0.1:6875/2760692773] [v2:127.0.0.1:6876/2760692773,v1:127.0.0.1:6877/2760692773] exists,up 8d6c8677-0ab7-4e74-91ad-7f9acd06e683 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:06:29.951 
INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: 1 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: 2 2026-03-08T23:06:29.951 INFO:tasks.workunit.client.0.vm03.stderr:223084: 3 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: 4 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: 5 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: 6 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: 7 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: 8 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: 9' 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836520 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836520 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520' 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672998 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672998 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998' 2026-03-08T23:06:29.952 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone. 
92 last_clean_interval [89,91) [v2:127.0.0.1:6842/2455326026,v1:127.0.0.1:6843/2455326026] [v2:127.0.0.1:6844/2455326026,v1:127.0.0.1:6845/2455326026] exists,up 1123dc8d-0537-43cf-bce4-3877782ccfc1 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 
2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:06:29.958 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 1 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 2 
2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 3 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 4 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 5 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 6 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 7 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 8 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: 9' 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836521 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836521 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521' 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 
2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672999 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672999 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999' 2026-03-08T23:06:29.959 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509477 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509477 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998 2-64424509477' 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776150 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776150 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998 2-64424509477 3-292057776150' 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182435 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182435 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998 2-64424509477 3-292057776150 4-107374182435' 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795010 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795010 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998 2-64424509477 3-292057776150 4-107374182435 5-420906795010' 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888096 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888096 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998 2-64424509477 3-292057776150 4-107374182435 5-420906795010 6-146028888096' 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724574 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724574 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998 2-64424509477 3-292057776150 4-107374182435 5-420906795010 6-146028888096 7-167503724574' 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.632 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:06:30.640 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/ceplient.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:06:30.640 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509478 2026-03-08T23:06:30.640 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509478 2026-03-08T23:06:30.640 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999 2-64424509478' 2026-03-08T23:06:30.640 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776149 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776149 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999 2-64424509478 3-292057776149' 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182436 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182436 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999 2-64424509478 3-292057776149 4-107374182436' 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=420906795011 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 420906795011 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999 2-64424509478 3-292057776149 4-107374182436 5-420906795011' 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888097 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888097 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999 2-64424509478 3-292057776149 4-107374182436 5-420906795011 6-146028888097' 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724575 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724575 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999 2-64424509478 3-292057776149 4-107374182436 5-420906795011 6-146028888097 7-167503724575' 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:30.641 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561052 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561052 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998 2-64424509477 3-292057776150 4-107374182435 5-420906795010 6-146028888096 7-167503724574 8-188978561052' 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=412316860418 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 412316860418 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836520 1-42949672998 2-64424509477 3-292057776150 4-107374182435 5-420906795010 6-146028888096 7-167503724574 8-188978561052 9-412316860418' 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836520 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836520 2026-03-08T23:06:31.086 
INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836520 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836520' 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.0 seq 21474836520 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836521 -lt 21474836520 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672998 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672998 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672998 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672998' 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.1 seq 42949672998 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672999 -lt 42949672998 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509477 2026-03-08T23:06:31.086 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561053 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561053 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999 2-64424509478 3-292057776149 4-107374182436 5-420906795011 6-146028888097 7-167503724575 8-188978561053' 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=412316860419 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 412316860419 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836521 1-42949672999 2-64424509478 3-292057776149 4-107374182436 5-420906795011 6-146028888097 7-167503724575 8-188978561053 9-412316860419' 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836521 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 
2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836521 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836521 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836521' 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.0 seq 21474836521 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836521 -lt 21474836521 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672999 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:06:31.116 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672999 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672999 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672999' 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.1 seq 42949672999 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672999 -lt 42949672999 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509478 2026-03-08T23:06:31.117 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509478 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509478 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509478' 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.2 seq 64424509478 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509478 -lt 64424509478 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-292057776149 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:06:32.822 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:32.822 
INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-292057776149 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776149 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 292057776149' 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.3 seq 292057776149 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776148 -lt 292057776149 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776150 -lt 292057776149 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:32.823 
INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182436 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182436 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182436 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182436' 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.4 seq 107374182436 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182436 -lt 107374182436 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:32.823 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-420906795011 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpersest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509477 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509477 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509477' 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.2 seq 64424509477 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509478 -lt 64424509477 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-292057776150 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-292057776150 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776150 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 292057776150' 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.3 seq 292057776150 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776148 -lt 292057776150 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776150 -lt 292057776150 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182435 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182435 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182435 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182435' 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.4 seq 107374182435 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182436 -lt 107374182435 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:32.830 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-420906795010 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helper.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-420906795011 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795011 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 420906795011' 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.5 seq 420906795011 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795011 -lt 420906795011 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888097 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888097 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888097 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888097' 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.6 seq 146028888097 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888097 -lt 146028888097 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724575 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724575 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724575 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724575' 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.7 seq 167503724575 2026-03-08T23:06:33.400 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:33.401 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724576 -lt 167503724575 2026-03-08T23:06:33.401 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:33.401 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:33.401 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561053 2026-03-08T23:06:33.401 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:33.401 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:33.401 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561053 2026-03-08T23:06:33.401 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561053 2026-03-08T23:06:33.409 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/cls.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:06:33.409 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:33.409 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-420906795010 2026-03-08T23:06:33.409 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=420906795010 2026-03-08T23:06:33.409 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 420906795010' 2026-03-08T23:06:33.409 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.5 seq 420906795010 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 420906795011 -lt 420906795010 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888096 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888096 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888096 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888096' 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.6 seq 146028888096 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888097 -lt 146028888096 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724574 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724574 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724574 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724574' 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.7 seq 167503724574 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724576 -lt 167503724574 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561052 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561052 2026-03-08T23:06:33.410 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561052 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/cone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561053' 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.8 seq 188978561053 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561054 -lt 188978561053 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-412316860419 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-412316860419 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=412316860419 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 412316860419' 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: waiting osd.9 seq 412316860419 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 412316860419 -lt 412316860419 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:34.405 
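The `get_num_active_clean` call traced above pipes `ceph --format json pg dump pgs` through the jq filter shown at ceph-helpers.sh:1368: it keeps PG states containing both "active" and "clean", drops any that also contain "stale", and counts the survivors. The filter below is the one from the trace; the input document is a hypothetical sample standing in for a real pg dump:

```shell
# Hypothetical pg dump in the shape the helper expects; the real input
# comes from `ceph --format json pg dump pgs`.
json='{"pg_stats":[{"state":"active+clean"},{"state":"active+clean+scrubbing"},{"state":"stale+active+clean"},{"state":"peering"}]}'

# The exact filter from ceph-helpers.sh:1368: count states that contain
# both "active" and "clean" but not "stale".
echo "$json" | jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
# → 2  (the stale PG and the peering PG are excluded)
```

`wait_for_clean` then compares this count against `get_num_pgs` (`ceph --format json status | jq .pgmap.num_pgs`), which is why the trace ends with `test 5 = 5` and `break`.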
INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:223086: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:34.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:lone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561052' 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.8 seq 188978561052 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561054 -lt 188978561052 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-412316860418 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=9 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-412316860418 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=412316860418 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 412316860418' 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: waiting osd.9 seq 412316860418 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 412316860419 -lt 412316860418 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:34.411 
INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 
2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:223084: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 223082 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:212: corrupt_and_repair_two: return_code=0 2026-03-08T23:06:34.411 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:213: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:06:34.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:215: corrupt_and_repair_two: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:06:34.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:216: corrupt_and_repair_two: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:263: corrupt_and_repair_erasure_coded: corrupt_and_repair_two td/osd-scrub-repair ecpool 3 5 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:185: corrupt_and_repair_two: local dir=td/osd-scrub-repair 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:186: corrupt_and_repair_two: local poolname=ecpool 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:187: corrupt_and_repair_two: local first=3 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:188: corrupt_and_repair_two: local second=5 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:193: corrupt_and_repair_two: pids= 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:194: corrupt_and_repair_two: run_in_background pids 
objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:06:34.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 225998"' 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 225998' 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:195: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 225999"' 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 225999' 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:196: corrupt_and_repair_two: wait_background pids 2026-03-08T23:06:34.433 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 225998 225999' 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 225998 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/226001: /' 2026-03-08T23:06:34.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/226003: /' 2026-03-08T23:06:34.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:06:34.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5 
2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING remove 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:06:35.769 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING remove 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: remove 1#2:eb822e21:::SOMETHING:head# 
2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:06:35.770 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 
2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:06:35.771 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:06:35.772 
INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: start osd.5 2026-03-08T23:06:35.772 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:06:35.781 
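[editor's note] The `activate_osd` trace above builds one long `ceph_args` string, and the quoting is deliberate: arguments like `--log-file=td/osd-scrub-repair/$name.log` stay single-quoted so the shell does not expand `$name` — Ceph daemons substitute such metavariables themselves at startup, giving each daemon its own log file and admin socket. A reduced sketch of the pattern (a few representative arguments only, assumed and simplified):

```shell
# Simplified ceph_args accumulation. Single quotes keep '$cluster' and
# '$name' literal for the daemon to expand; double quotes expand the
# shell-level $dir and $id as the trace shows.
dir=td/osd-scrub-repair
id=5
ceph_args="--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none"
ceph_args+=" --osd-data=$dir/$id"
ceph_args+=" --osd-journal=$dir/$id/journal"
ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
ceph_args+=' --log-file='"$dir"'/$name.log'
echo "$ceph_args"
```

This matches the final `ceph-osd -i 5 ...` invocation in the trace, where `'$cluster-$name.asok'` and `'$name.log'` appear still unexpanded.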
INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: 
kill_daemons: local trace=true 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T23:06:35.781 
INFO:tasks.workunit.client.0.vm03.stderr:226001: remove 0#2:eb822e21:::SOMETHING:head# 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:06:35.781 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T23:06:35.782 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:06:35.782 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:06:35.782 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:06:35.782 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 
2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: start osd.3 2026-03-08T23:06:35.795 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:06:39.794 
INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 2026-03-08T23:06:35.834+0000 7f15a77ca8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 2026-03-08T23:06:35.846+0000 7f15a77ca8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 2026-03-08T23:06:35.854+0000 7f15a77ca8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 0 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 2026-03-08T23:06:36.802+0000 7f15a77ca8c0 -1 Falling back to public interface 2026-03-08T23:06:39.794 
INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 1 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 2026-03-08T23:06:37.778+0000 7f15a77ca8c0 -1 osd.3 99 log_to_monitors true 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 2 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 
2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: 3 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:39.794 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226001: osd.3 up in weight 1 up_from 104 up_thru 104 down_at 100 la226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']' 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 2026-03-08T23:06:35.794+0000 7f2b221cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 2026-03-08T23:06:35.810+0000 7f2b221cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 2026-03-08T23:06:35.822+0000 7f2b221cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 0 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 2026-03-08T23:06:36.762+0000 7f2b221cf8c0 -1 Falling back to public interface 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 1 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:39.796 
INFO:tasks.workunit.client.0.vm03.stderr:226003: 2026-03-08T23:06:37.742+0000 7f2b221cf8c0 -1 osd.5 99 log_to_monitors true 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 2 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: 3 2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 
2026-03-08T23:06:39.796 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up' 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226003: osd.5 up in weight 1 up_from 104 up_thru 104 down_at 100 last_clean_interval [68,99) [v2:127.0.0.1:6874/1413462946,v1:127.0.0.1:6875/1413462946] [v2:127.0.0.1:6876/1413462946,v1:127.0.0.1:6877/1413462946] exists,up 3753215e-6921-4e7a-a0f3-4ad20a9c1a4f 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:06:40.238 
INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 1 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 2 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 3 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 4 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 5 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 6 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 7 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 8 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 9' 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836524 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836524 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524' 2026-03-08T23:06:40.238 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.239 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:06:40.239 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673003 2026-03-08T23:06:40.239 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673003 2026-03-08T23:06:40.239 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003' 2026-03-08T23:06:40.239 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.st_clean_interval [98,99) [v2:127.0.0.1:6826/4074614129,v1:127.0.0.1:6827/4074614129] [v2:127.0.0.1:6828/4074614129,v1:127.0.0.1:6829/4074614129] exists,up 8d6c8677-0ab7-4e74-91ad-7f9acd06e683 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: 
objectstore_tool: wait_for_clean 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: 1 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: 2 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: 3 2026-03-08T23:06:40.241 INFO:tasks.workunit.client.0.vm03.stderr:226003: 4 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: 5 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: 6 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: 7 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: 8 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: 9' 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd 
in $ids 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836525 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836525 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525' 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673004 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673004 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004' 2026-03-08T23:06:40.242 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.872 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:06:40.872 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509482 2026-03-08T23:06:40.872 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509482 2026-03-08T23:06:40.872 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004 2-64424509482' 2026-03-08T23:06:40.872 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.872 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=446676598787 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 446676598787 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004 2-64424509482 3-446676598787' 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph 
tell osd.4 flush_pg_stats 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182440 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182440 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004 2-64424509482 3-446676598787 4-107374182440' 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=446676598786 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 446676598786 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004 2-64424509482 3-446676598787 4-107374182440 5-446676598786' 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 
2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888100 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888100 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004 2-64424509482 3-446676598787 4-107374182440 5-446676598786 6-146028888100' 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724578 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724578 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004 2-64424509482 3-446676598787 4-107374182440 5-446676598786 6-146028888100 7-167503724578' 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.873 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
ceph tell osd.8 flush_pg_stats 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/c0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509481 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509481 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003 2-64424509481' 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=446676598786 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 446676598786 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003 2-64424509481 3-446676598786' 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182439 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182439 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003 2-64424509481 3-446676598786 4-107374182439' 2026-03-08T23:06:40.876 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=446676598787 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 446676598787 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003 2-64424509481 3-446676598786 4-107374182439 5-446676598787' 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888101 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888101 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003 2-64424509481 3-446676598786 4-107374182439 5-446676598787 6-146028888101' 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724579 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724579 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003 2-64424509481 3-446676598786 4-107374182439 5-446676598787 6-146028888101 7-167503724579' 2026-03-08T23:06:40.877 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:40.877 
INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561057 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561057 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003 2-64424509481 3-446676598786 4-107374182439 5-446676598787 6-146028888101 7-167503724579 8-188978561057' 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=412316860422 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 412316860422 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949673003 2-64424509481 3-446676598786 4-107374182439 5-446676598787 6-146028888101 7-167503724579 8-188978561057 9-412316860422' 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836524 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836524 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836524 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836524' 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.0 seq 21474836524 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836523 -lt 21474836524 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: 
flush_pg_stats: sleep 1 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836525 -lt 21474836524 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673003 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673003 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673003 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.1 seq 42949673003' 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.1 seq 42949673003 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:06:42.517 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673004 -lt 42949673003 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:22600lone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561056 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561056 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004 2-64424509482 3-446676598787 4-107374182440 5-446676598786 6-146028888100 7-167503724578 8-188978561056' 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=412316860423 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 412316860423 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836525 1-42949673004 2-64424509482 3-446676598787 4-107374182440 5-446676598786 6-146028888100 7-167503724578 8-188978561056 9-412316860423' 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836525 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836525 2026-03-08T23:06:42.522 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836525 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836525' 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.0 seq 21474836525 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:42.523 
INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836523 -lt 21474836525 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836525 -lt 21474836525 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673004 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673004 2026-03-08T23:06:42.523 
INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673004 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673004' 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.1 seq 42949673004 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:06:42.523 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673004 -lt 42949673004 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509481 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509481 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509481 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509481' 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.2 seq 64424509481 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509483 -lt 64424509481 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-446676598786 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-446676598786 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=446676598786 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 446676598786' 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.3 seq 446676598786 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 446676598787 -lt 446676598786 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182439 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182439 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182439 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182439' 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.4 seq 107374182439 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182440 -lt 107374182439 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.079 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-446676598787 2026-03-08T23:06:43.093 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:22743: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.093 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.093 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509482 2026-03-08T23:06:43.093 
INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:06:43.093 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.093 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509482 2026-03-08T23:06:43.093 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509482 2026-03-08T23:06:43.093 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509482' 2026-03-08T23:06:43.093 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.2 seq 64424509482 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509483 -lt 64424509482 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-446676598787 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-446676598787 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=446676598787 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 446676598787' 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.3 seq 446676598787 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 446676598787 -lt 446676598787 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182440 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182440 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182440 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182440' 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.4 seq 107374182440 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182440 -lt 107374182440 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.094 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-446676598786 2026-03-08T23:06:43.661 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-446676598787 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=446676598787 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 446676598787' 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.5 seq 446676598787 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 446676598787 -lt 446676598787 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888101 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888101 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888101 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888101' 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.6 seq 146028888101 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888101 -lt 146028888101 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724579 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724579 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724579 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724579' 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.7 seq 167503724579 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724580 -lt 167503724579 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561057 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561057 2026-03-08T23:06:43.662 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561057 2026-03-08T23:06:43.676 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.clie: flush_pg_stats: osd=5 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-446676598786 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=446676598786 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 446676598786' 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.5 seq 446676598786 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 446676598787 -lt 446676598786 
2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888100 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888100 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888100 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888100' 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.6 seq 146028888100 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888101 -lt 146028888100 2026-03-08T23:06:43.677 
INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724578 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724578 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724578 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724578' 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.7 seq 167503724578 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724580 -lt 167503724578 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561056 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:43.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:43.678 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561056 2026-03-08T23:06:43.678 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561056 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561057' 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.8 seq 188978561057 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561058 -lt 188978561057 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-412316860422 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-412316860422 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=412316860422 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 412316860422' 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: waiting osd.9 seq 412316860422 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 412316860423 -lt 412316860422 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:226001: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:06:44.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 225999 2026-03-08T23:06:44.703 INFO:tasks.workunit.client.0.vm03.stderr:nt.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting 
osd.8 seq 188978561056' 2026-03-08T23:06:44.703 INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.8 seq 188978561056 2026-03-08T23:06:44.703 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561058 -lt 188978561056 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-412316860423 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-412316860423 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=412316860423 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 412316860423' 2026-03-08T23:06:44.704 
INFO:tasks.workunit.client.0.vm03.stderr:226003: waiting osd.9 seq 412316860423 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 412316860423 -lt 412316860423 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:44.704 
INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:226003: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:44.704 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:06:44.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:06:44.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:06:44.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:197: corrupt_and_repair_two: return_code=0 2026-03-08T23:06:44.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:198: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:06:44.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: get_pg ecpool SOMETHING 2026-03-08T23:06:44.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:06:44.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:06:44.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:06:44.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:06:44.873 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:203: corrupt_and_repair_two: local pg=2.0 2026-03-08T23:06:44.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:204: corrupt_and_repair_two: repair 2.0 2026-03-08T23:06:44.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:06:44.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:06:44.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:44.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:44.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:44.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:45.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:06:22.079641+0000 2026-03-08T23:06:45.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:06:45.207 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.3 to repair 2026-03-08T23:06:45.220 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:06:22.079641+0000 2026-03-08T23:06:45.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:06:45.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:06:22.079641+0000 2026-03-08T23:06:45.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:06:45.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:06:45.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:06:45.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:06:45.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:45.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:45.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:45.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] 
| select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:45.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:06:22.079641+0000 '>' 2026-03-08T23:06:22.079641+0000 2026-03-08T23:06:45.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:06:46.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:06:46.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:06:46.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:06:46.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:46.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:46.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:46.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:46.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:06:22.079641+0000 '>' 2026-03-08T23:06:22.079641+0000 2026-03-08T23:06:46.566 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:06:47.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:06:47.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:06:47.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:06:47.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:47.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:47.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:47.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:47.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:06:22.079641+0000 '>' 2026-03-08T23:06:22.079641+0000 2026-03-08T23:06:47.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:06:48.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:06:48.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:06:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:06:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:06:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:06:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:06:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:06:45.733110+0000 '>' 2026-03-08T23:06:22.079641+0000 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:208: corrupt_and_repair_two: pids= 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:209: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 
2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 229427"' 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 229427' 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:210: corrupt_and_repair_two: run_in_background pids objectstore_tool td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 229428"' 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 229428' 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:211: corrupt_and_repair_two: wait_background pids 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: 
pids=' 229427 229428' 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 229427 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/229430: /' 2026-03-08T23:06:48.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/229432: /' 2026-03-08T23:06:48.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T23:06:48.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:06:49.661 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:06:49.661 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:06:49.661 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=5 2026-03-08T23:06:49.661 INFO:tasks.workunit.client.0.vm03.stderr:229432: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:06:49.661 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:06:49.661 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:06:49.661 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=5 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.5 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:49.662 
INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 5 SOMETHING list-attrs 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=5 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/5 SOMETHING list-attrs 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: _ 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: hinfo_key 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: snapset 
2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 5 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=5 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/5 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:06:49.662 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 
2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/5' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/5/journal' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:06:49.663 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:06:49.664 
INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/5 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.5 2026-03-08T23:06:49.664 INFO:tasks.workunit.client.0.vm03.stderr:229432: start osd.5 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 5 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/5 --osd-journal=td/osd-scrub-repair/5/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_rec229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 
2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING list-attrs 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:06:49.669 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING list-attrs 2026-03-08T23:06:49.669 
INFO:tasks.workunit.client.0.vm03.stderr:229430: Error getting attr on : 2.0s0_head,0#-4:00000000:::scrub_2.0s0:head#, (61) No data available 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: _ 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: hinfo_key 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: snapset 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:06:49.670 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:06:49.685 
INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:49.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:06:49.686 
INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: start osd.3 2026-03-08T23:06:49.686 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
'--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 2026-03-08T23:06:49.710+0000 7f6171f428c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 2026-03-08T23:06:49.718+0000 7f6171f428c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 2026-03-08T23:06:49.718+0000 7f6171f428c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 0
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 2026-03-08T23:06:50.934+0000 7f6171f428c0 -1 Falling back to public interface
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 1
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 2026-03-08T23:06:52.094+0000 7f6171f428c0 -1 osd.3 105 log_to_monitors true
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 2
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 2026-03-08T23:06:53.398+0000 7f6168ef2640 -1 osd.3 105 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:06:53.484 INFO:tasks.workunit.client.0.vm03.stderr:229430: 3
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuovery_ops
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/5/whoami
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 5 = 5 ']'
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: 2026-03-08T23:06:49.686+0000 7f71be3398c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: 2026-03-08T23:06:49.694+0000 7f71be3398c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: 2026-03-08T23:06:49.706+0000 7f71be3398c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 5
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:06:53.641 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=5
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: 0
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up'
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: 2026-03-08T23:06:50.922+0000 7f71be3398c0 -1 Falling back to public interface
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: 1
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up'
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: 2026-03-08T23:06:51.694+0000 7f71be3398c0 -1 osd.5 105 log_to_monitors true
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: 2
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up'
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: 3
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:53.642 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.5 up'
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: osd.5 up in weight 1 up_from 111 up_thru 111 down_at 106 last_clean_interval [104,105) [v2:127.0.0.1:6826/1366422689,v1:127.0.0.1:6827/1366422689] [v2:127.0.0.1:6828/1366422689,v1:127.0.0.1:6829/1366422689] exists,up 8d6c8677-0ab7-4e74-91ad-7f9acd06e683
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: 1
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: 2
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: 3
2026-03-08T23:06:54.065 INFO:tasks.workunit.client.0.vm03.stderr:229432: 4
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: 5
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: 6
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: 7
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: 8
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: 9'
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836529
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836529
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529'
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673007
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673007
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007'
2026-03-08T23:06:54.066 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/ntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: osd.3 up in weight 1 up_from 113 up_thru 113 down_at 106 last_clean_interval [104,105) [v2:127.0.0.1:6874/146570041,v1:127.0.0.1:6875/146570041] [v2:127.0.0.1:6876/146570041,v1:127.0.0.1:6877/146570041] exists,up 3753215e-6921-4e7a-a0f3-4ad20a9c1a4f
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: 1
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: 2
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: 3
2026-03-08T23:06:54.076 INFO:tasks.workunit.client.0.vm03.stderr:229430: 4
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: 5
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: 6
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: 7
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: 8
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: 9'
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836530
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836530
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530'
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673008
2026-03-08T23:06:54.077 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673008
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008'
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509487
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509487
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008 2-64424509487'
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=485331304451
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 485331304451
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008 2-64424509487 3-485331304451'
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182444
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182444
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008 2-64424509487 3-485331304451 4-107374182444'
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=476741369859
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 476741369859
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008 2-64424509487 3-485331304451 4-107374182444 5-476741369859'
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888105
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888105
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008 2-64424509487 3-485331304451 4-107374182444 5-476741369859 6-146028888105'
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724584
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724584
2026-03-08T23:06:54.685 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008 2-64424509487 3-485331304451 4-107374182444 5-476741369859 6-146028888105 7-167503724584'
2026-03-08T23:06:54.760 INFO:tasks.workunit.client.0.vm03.stderr:2294clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509486
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509486
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007 2-64424509486'
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=485331304450
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 485331304450
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007 2-64424509486 3-485331304450'
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.4 flush_pg_stats
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182443
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182443
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007 2-64424509486 3-485331304450 4-107374182443'
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.5 flush_pg_stats
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=476741369858
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 476741369858
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007 2-64424509486 3-485331304450 4-107374182443 5-476741369858'
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.6 flush_pg_stats
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888104
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888104
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007 2-64424509486 3-485331304450 4-107374182443 5-476741369858 6-146028888104'
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.7 flush_pg_stats
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724583
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724583
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007 2-64424509486 3-485331304450 4-107374182443 5-476741369858 6-146028888104 7-167503724583'
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:54.761 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561061
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561061
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007 2-64424509486 3-485331304450 4-107374182443 5-476741369858 6-146028888104 7-167503724583 8-188978561061'
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=412316860427
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 412316860427
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673007 2-64424509486 3-485331304450 4-107374182443 5-476741369858 6-146028888104 7-167503724583 8-188978561061 9-412316860427'
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:06:55.242 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836529
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836529
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836529
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836529'
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.0 seq 21474836529
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836530 -lt 21474836529
2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273:
flush_pg_stats: for s in $seqs 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673007 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673007 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673007 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673007' 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.1 seq 42949673007 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673008 -lt 42949673007 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:55.243 
INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509486 2026-03-08T23:06:55.243 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu30: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.8 flush_pg_stats 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561062 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561062 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008 2-64424509487 3-485331304451 4-107374182444 5-476741369859 6-146028888105 7-167503724584 8-188978561062' 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.9 flush_pg_stats 2026-03-08T23:06:55.263 
INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=412316860428 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 412316860428 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673008 2-64424509487 3-485331304451 4-107374182444 5-476741369859 6-146028888105 7-167503724584 8-188978561062 9-412316860428' 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836530 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836530 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836530 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836530' 2026-03-08T23:06:55.263 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.0 seq 21474836530 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836530 -lt 21474836530 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673008 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673008 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673008 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.1 seq 42949673008' 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.1 seq 42949673008 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673008 -lt 42949673008 2026-03-08T23:06:55.264 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: /cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509486 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509486 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509486' 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.2 seq 64424509486 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509485 
-lt 64424509486 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509487 -lt 64424509486 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-485331304450 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-485331304450 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=485331304450 
2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 485331304450' 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.3 seq 485331304450 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 485331304451 -lt 485331304450 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182443 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182443 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182443 2026-03-08T23:06:57.002 
INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182443' 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.4 seq 107374182443 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182445 -lt 107374182443 2026-03-08T23:06:57.002 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.003 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.003 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-476741369858 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-he cut -d - -f 1 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509487 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509487 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509487 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509487' 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.2 seq 64424509487 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509485 -lt 64424509487 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509487 -lt 64424509487 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-485331304451 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.021 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-485331304451 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=485331304451 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 485331304451' 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.3 seq 485331304451 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 485331304451 -lt 485331304451 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 4-107374182444 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=4 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 4-107374182444 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182444 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.4 seq 107374182444' 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.4 seq 107374182444 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 4 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182445 -lt 107374182444 2026-03-08T23:06:57.022 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.clilpers.sh:2274: flush_pg_stats: 
osd=5 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-476741369858 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=476741369858 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 476741369858' 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.5 seq 476741369858 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 5 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 476741369859 -lt 476741369858 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888104 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:57.569 
INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888104 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888104 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888104' 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.6 seq 146028888104 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888106 -lt 146028888104 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.569 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724583 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724583 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724583 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724583' 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.7 seq 167503724583 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724584 -lt 167503724583 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561061 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561061 2026-03-08T23:06:57.570 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561061 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephteent.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 5-476741369859 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=5 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 5-476741369859 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=476741369859 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.5 seq 476741369859' 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.5 seq 476741369859 2026-03-08T23:06:57.579 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 5 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 476741369859 -lt 476741369859 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 6-146028888105 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=6 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 6-146028888105 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888105 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.6 seq 146028888105' 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.6 seq 146028888105 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 6 2026-03-08T23:06:57.580 
INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888106 -lt 146028888105 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 7-167503724584 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=7 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 7-167503724584 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724584 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.7 seq 167503724584' 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.7 seq 167503724584 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 7 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724584 -lt 167503724584 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 8-188978561062 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=8 2026-03-08T23:06:57.580 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229st/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561061' 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.8 seq 188978561061 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561062 -lt 188978561061 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-412316860427 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-412316860427 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=412316860427 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 412316860427' 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: waiting osd.9 seq 412316860427 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 412316860428 -lt 412316860427 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:58.578 
INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 
2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:58.578 INFO:tasks.workunit.client.0.vm03.stderr:229432: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:58.579 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:58.579 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:58.579 INFO:tasks.workunit.client.0.vm03.stderr:229432: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:58.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 8-188978561062 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561062 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.8 seq 188978561062' 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.8 seq 188978561062 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 8 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561062 -lt 188978561062 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 9-412316860428 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=9 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 9-412316860428 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=412316860428 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.9 seq 412316860428' 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: waiting osd.9 seq 412316860428 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 9 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 412316860428 -lt 412316860428 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: 
expression+='select(contains("stale") | not)' 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:229430: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:06:58.589 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 229428 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:212: corrupt_and_repair_two: return_code=0 2026-03-08T23:06:58.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:213: corrupt_and_repair_two: '[' 0 -ne 0 ']' 2026-03-08T23:06:58.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:215: corrupt_and_repair_two: rados --pool ecpool get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:06:58.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:216: corrupt_and_repair_two: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local 
dir=td/osd-scrub-repair 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:58.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:58.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:58.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:06:58.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:06:58.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:06:58.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:06:58.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:06:58.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:06:58.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:06:58.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:06:58.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:06:58.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:06:58.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:06:58.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:06:58.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:06:58.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:06:58.825 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:58.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:58.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:06:58.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:06:58.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:06:58.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:06:58.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:06:58.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:06:58.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:06:58.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:06:58.830 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:06:58.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:06:58.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:06:58.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:06:58.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:06:58.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:06:58.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:06:58.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:06:58.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:06:58.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:06:58.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:06:58.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:06:58.835 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:58.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:58.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:06:58.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:06:58.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:06:58.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:06:58.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:06:58.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:06:58.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:06:58.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_and_repair_replicated td/osd-scrub-repair
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:94: TEST_corrupt_and_repair_replicated: local dir=td/osd-scrub-repair
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:95: TEST_corrupt_and_repair_replicated: local poolname=rbd
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:97: TEST_corrupt_and_repair_replicated: run_mon td/osd-scrub-repair a --osd_pool_default_size=2
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a
2026-03-08T23:06:58.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair --osd_pool_default_size=2
2026-03-08T23:06:58.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:06:58.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:06:58.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:06:58.868 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:06:58.868 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:06:58.868 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:06:58.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:06:58.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2
2026-03-08T23:06:58.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:06:58.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:06:58.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:06:58.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:06:58.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:06:58.899 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:06:58.899 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:06:58.899 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:06:58.899 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:06:58.899 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:06:58.899 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:06:58.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:06:58.899 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:06:58.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:06:58.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T23:06:58.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:06:58.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:06:58.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:06:58.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:06:58.966 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T23:06:59.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:98: TEST_corrupt_and_repair_replicated: run_mgr td/osd-scrub-repair x
2026-03-08T23:06:59.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T23:06:59.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:06:59.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:06:59.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:06:59.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T23:06:59.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:06:59.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:06:59.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:06:59.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:06:59.142 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:06:59.142 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:06:59.142 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:06:59.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:06:59.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:06:59.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:99: TEST_corrupt_and_repair_replicated: run_osd td/osd-scrub-repair 0
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:06:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:06:59.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:06:59.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:06:59.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:06:59.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:06:59.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:06:59.163 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:06:59.163 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:06:59.163 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:06:59.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:06:59.170 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:06:59.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0d979348-dd78-4365-8273-67ad3aed54c4
2026-03-08T23:06:59.171 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 0d979348-dd78-4365-8273-67ad3aed54c4
2026-03-08T23:06:59.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 0d979348-dd78-4365-8273-67ad3aed54c4'
2026-03-08T23:06:59.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:06:59.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQATAa5pyTg3CxAAl0AzBpkc0YMBa/ax5CunQA==
2026-03-08T23:06:59.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQATAa5pyTg3CxAAl0AzBpkc0YMBa/ax5CunQA=="}'
2026-03-08T23:06:59.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0d979348-dd78-4365-8273-67ad3aed54c4 -i td/osd-scrub-repair/0/new.json
2026-03-08T23:06:59.284 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:06:59.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T23:06:59.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQATAa5pyTg3CxAAl0AzBpkc0YMBa/ax5CunQA== --osd-uuid 0d979348-dd78-4365-8273-67ad3aed54c4
2026-03-08T23:06:59.313 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:59.314+0000 7f70937b88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:59.318 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:59.322+0000 7f70937b88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:59.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:59.322+0000 7f70937b88c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:06:59.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:59.322+0000 7f70937b88c0 -1 bdev(0x562ba30aac00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T23:06:59.319 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:06:59.322+0000 7f70937b88c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T23:07:01.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T23:07:01.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:07:01.599 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T23:07:01.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T23:07:01.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:07:01.697 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:07:01.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T23:07:01.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:07:01.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:07:01.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:07:01.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:07:01.753 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:01.754+0000 7f1164d668c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:07:01.755 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:01.758+0000 7f1164d668c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:07:01.757 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:01.758+0000 7f1164d668c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:07:01.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:07:02.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:07:02.951 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:02.954+0000 7f1164d668c0 -1 Falling back to public interface
2026-03-08T23:07:03.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:07:03.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:07:03.009 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:07:03.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:07:03.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:07:03.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:07:03.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:07:04.174 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:04.174+0000 7f1164d668c0 -1 osd.0 0 log_to_monitors true
2026-03-08T23:07:04.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:07:04.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:07:04.178 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:07:04.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:07:04.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:07:04.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:07:04.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:07:05.362 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:07:05.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:07:05.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:07:05.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:07:05.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:07:05.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:07:05.527 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3385852343,v1:127.0.0.1:6803/3385852343] [v2:127.0.0.1:6804/3385852343,v1:127.0.0.1:6805/3385852343] exists,up 0d979348-dd78-4365-8273-67ad3aed54c4
2026-03-08T23:07:05.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:07:05.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:100: TEST_corrupt_and_repair_replicated: run_osd td/osd-scrub-repair 1
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:07:05.528 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:07:05.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:07:05.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:07:05.531 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 9eca177f-3df2-4524-8d7b-f76bc1689b44
2026-03-08T23:07:05.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=9eca177f-3df2-4524-8d7b-f76bc1689b44
2026-03-08T23:07:05.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 9eca177f-3df2-4524-8d7b-f76bc1689b44'
2026-03-08T23:07:05.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:07:05.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAZAa5pdmezIBAA/opT3ofxSswtDz+bNgrvBg==
2026-03-08T23:07:05.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAZAa5pdmezIBAA/opT3ofxSswtDz+bNgrvBg=="}'
2026-03-08T23:07:05.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 9eca177f-3df2-4524-8d7b-f76bc1689b44 -i td/osd-scrub-repair/1/new.json
2026-03-08T23:07:05.698 
INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:07:05.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:07:05.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAZAa5pdmezIBAA/opT3ofxSswtDz+bNgrvBg== --osd-uuid 9eca177f-3df2-4524-8d7b-f76bc1689b44 2026-03-08T23:07:05.727 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:05.730+0000 7f4ac0c168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:05.729 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:05.730+0000 7f4ac0c168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:05.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:05.734+0000 7f4ac0c168c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:07:05.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:05.734+0000 7f4ac0c168c0 -1 bdev(0x56196170bc00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:07:05.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:05.734+0000 7f4ac0c168c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:07:07.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:07:07.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:07:07.987 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:07:07.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:07:07.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:07:08.179 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:07:08.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:07:08.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:07:08.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:07:08.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:07:08.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:07:08.202 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:08.202+0000 7fede390e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:08.210 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:08.214+0000 7fede390e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:08.212 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:08.214+0000 7fede390e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:07:08.346 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:08.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:08.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:09.175 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:09.178+0000 7fede390e8c0 -1 Falling back to public interface 2026-03-08T23:07:09.510 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:07:09.510 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:09.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:09.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:07:09.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:09.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:09.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:10.155 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:10.158+0000 7fede390e8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:07:10.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:10.676 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:07:10.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:10.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:07:10.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:10.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:10.884 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:11.886 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:07:11.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:11.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:11.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:07:11.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:11.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:12.047 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1875627174,v1:127.0.0.1:6811/1875627174] [v2:127.0.0.1:6812/1875627174,v1:127.0.0.1:6813/1875627174] exists,up 9eca177f-3df2-4524-8d7b-f76bc1689b44 2026-03-08T23:07:12.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:07:12.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:07:12.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:07:12.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:101: TEST_corrupt_and_repair_replicated: 
create_rbd_pool 2026-03-08T23:07:12.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:07:12.205 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T23:07:12.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:07:12.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:07:12.397 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:07:12.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:07:13.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:07:13.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:102: TEST_corrupt_and_repair_replicated: wait_for_clean 2026-03-08T23:07:13.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:07:13.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:07:13.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:07:13.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 
2026-03-08T23:07:13.701 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:07:13.701 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:07:13.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:07:13.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:07:13.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:07:13.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:07:13.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:07:13.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:07:13.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:07:13.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:07:13.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:07:13.939 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:07:13.939 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:07:13.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:07:13.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:07:13.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:07:14.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483 2026-03-08T23:07:14.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483 2026-03-08T23:07:14.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483' 2026-03-08T23:07:14.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:07:14.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:07:14.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672962 2026-03-08T23:07:14.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672962 2026-03-08T23:07:14.094 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-42949672962' 2026-03-08T23:07:14.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:07:14.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483 2026-03-08T23:07:14.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:07:14.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:07:14.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483 2026-03-08T23:07:14.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:07:14.096 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483 2026-03-08T23:07:14.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483 2026-03-08T23:07:14.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483' 2026-03-08T23:07:14.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:07:14.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 
21474836483 2026-03-08T23:07:14.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:07:15.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:07:15.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:07:15.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:07:15.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:07:16.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:07:16.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:07:16.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836483 2026-03-08T23:07:16.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:07:16.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672962 2026-03-08T23:07:16.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:07:16.608 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:07:16.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672962 2026-03-08T23:07:16.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:07:16.610 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672962 2026-03-08T23:07:16.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672962 2026-03-08T23:07:16.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672962' 2026-03-08T23:07:16.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:07:16.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672962 -lt 42949672962 2026-03-08T23:07:16.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:07:16.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:07:16.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:07:16.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 
2026-03-08T23:07:16.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:07:16.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:07:16.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:07:16.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:07:16.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:07:16.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:07:16.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:07:17.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:07:17.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:07:17.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:07:17.140 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:07:17.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:07:17.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:07:17.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:07:17.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:104: TEST_corrupt_and_repair_replicated: add_something td/osd-scrub-repair rbd 2026-03-08T23:07:17.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:07:17.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=rbd 2026-03-08T23:07:17.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING 2026-03-08T23:07:17.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:07:17.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:07:17.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:07:17.565 
INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:07:17.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:07:17.779 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:07:17.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:07:17.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:07:17.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool rbd put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T23:07:17.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:105: TEST_corrupt_and_repair_replicated: get_not_primary rbd SOMETHING 2026-03-08T23:07:17.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=rbd 2026-03-08T23:07:17.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T23:07:17.819 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary rbd SOMETHING 2026-03-08T23:07:17.819 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=rbd 2026-03-08T23:07:17.819 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local 
objectname=SOMETHING 2026-03-08T23:07:17.819 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map rbd SOMETHING 2026-03-08T23:07:17.819 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:07:17.996 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T23:07:17.996 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map rbd SOMETHING 2026-03-08T23:07:17.996 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T23:07:18.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:105: TEST_corrupt_and_repair_replicated: corrupt_and_repair_one td/osd-scrub-repair rbd 0 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=rbd 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=0 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 
2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 
2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:07:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:07:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:07:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:07:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:07:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:07:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:07:18.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:07:18.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:07:18.260 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING remove 2026-03-08T23:07:18.916 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:eb822e21:::SOMETHING:head# 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: 
ceph_args+=' --osd-journal-size=100' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n 
'' ']' 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:19.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' 
--osd-mclock-profile=high_recovery_ops' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:07:19.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:07:19.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:07:19.447 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:07:19.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:07:19.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:07:19.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:07:19.448 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:07:19.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:07:19.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:07:19.461 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:19.461+0000 7fcf758628c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:19.463 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:19.465+0000 7fcf758628c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:19.465 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:19.465+0000 7fcf758628c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:19.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:07:19.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:20.430 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:20.433+0000 7fcf758628c0 -1 Falling back to public interface 2026-03-08T23:07:20.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:07:20.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:20.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:07:20.766 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:07:20.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:20.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:07:20.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:21.910 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:21.913+0000 7fcf758628c0 -1 osd.0 20 log_to_monitors true 2026-03-08T23:07:21.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:21.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:21.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:07:21.923 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:07:21.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:21.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:07:22.101 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:22.789 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:22.789+0000 7fcf6c812640 -1 osd.0 20 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:07:23.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:23.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:23.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:07:23.103 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:07:23.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:23.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 25 up_thru 25 down_at 21 last_clean_interval [5,20) [v2:127.0.0.1:6802/1649999779,v1:127.0.0.1:6803/1649999779] [v2:127.0.0.1:6804/1649999779,v1:127.0.0.1:6805/1649999779] exists,up 0d979348-dd78-4365-8273-67ad3aed54c4 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:07:23.266 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:07:23.266 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:07:23.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:07:23.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:07:23.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:07:23.324 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:07:23.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:07:23.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:07:23.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:07:23.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:07:23.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:07:23.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:07:23.479 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:07:23.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:07:23.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:07:23.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:07:23.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182402 
2026-03-08T23:07:23.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182402 2026-03-08T23:07:23.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182402' 2026-03-08T23:07:23.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:07:23.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:07:23.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672965 2026-03-08T23:07:23.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672965 2026-03-08T23:07:23.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182402 1-42949672965' 2026-03-08T23:07:23.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:07:23.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-107374182402 2026-03-08T23:07:23.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:07:23.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:07:23.628 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-107374182402 2026-03-08T23:07:23.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:07:23.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182402 2026-03-08T23:07:23.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 107374182402' 2026-03-08T23:07:23.629 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 107374182402 2026-03-08T23:07:23.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:07:23.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182402 -lt 107374182402 2026-03-08T23:07:23.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:07:23.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672965 2026-03-08T23:07:23.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:07:23.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:07:23.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672965 
2026-03-08T23:07:23.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:07:23.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672965 2026-03-08T23:07:23.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672965' 2026-03-08T23:07:23.794 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672965 2026-03-08T23:07:23.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:07:23.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672965 2026-03-08T23:07:23.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:07:23.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:07:23.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:07:24.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:07:24.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:07:24.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: 
get_num_active_clean 2026-03-08T23:07:24.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:07:24.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:07:24.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:07:24.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:07:24.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:07:24.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:07:24.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:07:24.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:07:24.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:07:24.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:07:24.504 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:07:24.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:07:24.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg rbd SOMETHING 2026-03-08T23:07:24.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=rbd 2026-03-08T23:07:24.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:07:24.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map rbd SOMETHING 2026-03-08T23:07:24.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:07:24.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=1.3 2026-03-08T23:07:24.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 1.3 2026-03-08T23:07:24.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.3 2026-03-08T23:07:24.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 1.3 2026-03-08T23:07:24.673 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:24.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:24.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:24.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:24.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:07:12.399687+0000 2026-03-08T23:07:24.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.3 2026-03-08T23:07:24.987 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.3 on osd.1 to repair 2026-03-08T23:07:24.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.3 2026-03-08T23:07:12.399687+0000 2026-03-08T23:07:24.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.3 2026-03-08T23:07:24.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:07:12.399687+0000 2026-03-08T23:07:24.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 
2026-03-08T23:07:24.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:07:24.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:25.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:25.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:25.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:25.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:25.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:25.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:12.399687+0000 '>' 2026-03-08T23:07:12.399687+0000 2026-03-08T23:07:25.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:07:26.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:07:26.168 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:26.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:26.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:26.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:26.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:26.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:26.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:12.399687+0000 '>' 2026-03-08T23:07:12.399687+0000 2026-03-08T23:07:26.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:07:27.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:07:27.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:27.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:27.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:27.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:27.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:27.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:27.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:12.399687+0000 '>' 2026-03-08T23:07:12.399687+0000 2026-03-08T23:07:27.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:07:28.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:07:28.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:28.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:28.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:28.504 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:28.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:28.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:25.318601+0000 '>' 2026-03-08T23:07:12.399687+0000 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:07:28.657 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:07:28.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:07:28.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:07:28.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:07:28.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:07:28.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:07:28.762 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:07:28.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T23:07:28.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:07:28.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:07:28.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:07:28.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:07:28.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:07:28.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING list-attrs 2026-03-08T23:07:29.094 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.3_head,#-3:c0000000:::scrub_1.3:head#, (61) No data available 2026-03-08T23:07:29.094 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T23:07:29.094 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd 
td/osd-scrub-repair 1 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:07:29.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:07:29.382 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:29.382 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:07:29.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:07:29.383 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:07:29.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:07:29.384 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:07:29.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:07:29.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:07:29.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:07:29.390 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:07:29.399 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:29.401+0000 7f479574c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:29.400 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:29.401+0000 7f479574c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:29.402 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:29.401+0000 7f479574c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:07:29.560 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:29.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:30.595 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:30.597+0000 7f479574c8c0 -1 Falling back to public interface 2026-03-08T23:07:30.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:30.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:30.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:07:30.731 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:07:30.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:30.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:30.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:31.600 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:31.601+0000 7f479574c8c0 -1 osd.1 26 log_to_monitors true 2026-03-08T23:07:31.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:31.893 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:31.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:07:31.893 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:07:31.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:31.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:32.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:32.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:32.489+0000 7f478c6fc640 -1 osd.1 26 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:07:33.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:33.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:33.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:07:33.060 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:07:33.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:33.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:33.227 
INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 30 up_thru 30 down_at 27 last_clean_interval [10,26) [v2:127.0.0.1:6810/3990410565,v1:127.0.0.1:6811/3990410565] [v2:127.0.0.1:6812/3990410565,v1:127.0.0.1:6813/3990410565] exists,up 9eca177f-3df2-4524-8d7b-f76bc1689b44 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:07:33.227 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 
2026-03-08T23:07:33.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:07:33.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:07:33.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:07:33.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:07:33.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:07:33.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:07:33.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:07:33.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:07:33.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:07:33.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:07:33.457 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:07:33.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 
2026-03-08T23:07:33.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:07:33.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:07:33.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182405
2026-03-08T23:07:33.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182405
2026-03-08T23:07:33.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182405'
2026-03-08T23:07:33.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:07:33.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:07:33.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=128849018882
2026-03-08T23:07:33.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 128849018882
2026-03-08T23:07:33.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182405 1-128849018882'
2026-03-08T23:07:33.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:07:33.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-107374182405
2026-03-08T23:07:33.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:07:33.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:07:33.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-107374182405
2026-03-08T23:07:33.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:07:33.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182405
2026-03-08T23:07:33.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 107374182405'
2026-03-08T23:07:33.616 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 107374182405
2026-03-08T23:07:33.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:07:33.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182403 -lt 107374182405
2026-03-08T23:07:33.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:07:34.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:07:34.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:07:34.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182405 -lt 107374182405
2026-03-08T23:07:34.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:07:34.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-128849018882
2026-03-08T23:07:34.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:07:34.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:07:34.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-128849018882
2026-03-08T23:07:34.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:07:34.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=128849018882
2026-03-08T23:07:34.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 128849018882'
2026-03-08T23:07:34.951 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 128849018882
2026-03-08T23:07:34.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:07:35.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 128849018882 -lt 128849018882
2026-03-08T23:07:35.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:07:35.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:07:35.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:07:35.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0
2026-03-08T23:07:35.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:07:35.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:07:35.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:07:35.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:07:35.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:07:35.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:07:35.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:07:35.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4
2026-03-08T23:07:35.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:07:35.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:07:35.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:07:35.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4
2026-03-08T23:07:35.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:07:35.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:07:35.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool rbd get SOMETHING td/osd-scrub-repair/COPY
2026-03-08T23:07:35.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY
2026-03-08T23:07:35.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:107: TEST_corrupt_and_repair_replicated: get_primary rbd SOMETHING
2026-03-08T23:07:35.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=rbd
2026-03-08T23:07:35.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING
2026-03-08T23:07:35.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map rbd SOMETHING
2026-03-08T23:07:35.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:107: TEST_corrupt_and_repair_replicated: corrupt_and_repair_one td/osd-scrub-repair rbd 1
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:226: corrupt_and_repair_one: local dir=td/osd-scrub-repair
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:227: corrupt_and_repair_one: local poolname=rbd
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:228: corrupt_and_repair_one: local osd=1
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:233: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 1 SOMETHING remove
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING remove
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:07:35.869 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:07:35.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:07:35.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:07:35.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:07:36.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:07:36.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING remove
2026-03-08T23:07:36.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:07:36.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:07:36.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:07:36.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:07:36.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:07:36.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING remove
2026-03-08T23:07:36.822 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:eb822e21:::SOMETHING:head#
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:07:37.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:07:37.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:07:37.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:07:37.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T23:07:37.355 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T23:07:37.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:07:37.355 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T23:07:37.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T23:07:37.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:07:37.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:07:37.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:07:37.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:37.373+0000 7f8f0f11c8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:07:37.372 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:37.373+0000 7f8f0f11c8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:07:37.374 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:37.373+0000 7f8f0f11c8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:07:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:07:37.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:07:38.067 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:38.069+0000 7f8f0f11c8c0 -1 Falling back to public interface
2026-03-08T23:07:38.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:07:38.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:07:38.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:07:38.694 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:07:38.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:07:38.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:07:38.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:07:39.048 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:39.049+0000 7f8f0f11c8c0 -1 osd.1 31 log_to_monitors true
2026-03-08T23:07:39.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:07:39.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:07:39.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:07:39.867 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:07:39.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:07:39.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:07:40.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:07:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:07:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:07:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:07:41.047 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:07:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:07:41.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 35 up_thru 35 down_at 32 last_clean_interval [30,31) [v2:127.0.0.1:6810/3196131836,v1:127.0.0.1:6811/3196131836] [v2:127.0.0.1:6812/3196131836,v1:127.0.0.1:6813/3196131836] exists,up 9eca177f-3df2-4524-8d7b-f76bc1689b44
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:07:41.197 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:07:41.198 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:07:41.198 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:07:41.198 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:07:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:07:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:07:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:07:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:07:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:07:41.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:07:41.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:07:41.410 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:07:41.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:07:41.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:07:41.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:07:41.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182407 2026-03-08T23:07:41.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182407 2026-03-08T23:07:41.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182407' 2026-03-08T23:07:41.487 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:07:41.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:07:41.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855362 2026-03-08T23:07:41.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855362 2026-03-08T23:07:41.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182407 1-150323855362' 2026-03-08T23:07:41.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:07:41.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-107374182407 2026-03-08T23:07:41.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:07:41.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:07:41.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-107374182407 2026-03-08T23:07:41.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:07:41.566 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182407 2026-03-08T23:07:41.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 107374182407' 2026-03-08T23:07:41.566 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 107374182407 2026-03-08T23:07:41.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:07:41.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182406 -lt 107374182407 2026-03-08T23:07:41.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:07:42.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:07:42.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:07:42.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182407 -lt 107374182407 2026-03-08T23:07:42.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:07:42.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-150323855362 2026-03-08T23:07:42.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:07:42.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:07:42.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-150323855362 2026-03-08T23:07:42.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:07:42.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855362 2026-03-08T23:07:42.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 150323855362' 2026-03-08T23:07:42.913 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 150323855362 2026-03-08T23:07:42.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:07:43.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855362 -lt 150323855362 2026-03-08T23:07:43.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:07:43.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:07:43.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:07:43.260 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:07:43.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:07:43.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:07:43.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:07:43.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:07:43.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:07:43.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:07:43.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:07:43.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:07:43.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:07:43.419 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:07:43.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:07:43.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:07:43.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:07:43.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:07:43.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: get_pg rbd SOMETHING 2026-03-08T23:07:43.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=rbd 2026-03-08T23:07:43.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:07:43.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map rbd SOMETHING 2026-03-08T23:07:43.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:07:43.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:237: corrupt_and_repair_one: local pg=1.3 2026-03-08T23:07:43.774 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:238: corrupt_and_repair_one: repair 1.3 2026-03-08T23:07:43.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.3 2026-03-08T23:07:43.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 1.3 2026-03-08T23:07:43.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:43.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:43.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:43.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:43.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:43.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.3 2026-03-08T23:07:44.078 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.3 on osd.1 to repair 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.3 2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:44.090 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.3 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:44.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:44.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: 
test 2026-03-08T23:07:25.318601+0000 '>' 2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:44.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:07:45.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:07:45.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:45.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:45.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:45.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:45.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:45.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:45.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:25.318601+0000 '>' 2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:45.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:07:46.433 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:07:46.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:46.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:46.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:46.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:46.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:46.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:46.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:25.318601+0000 '>' 2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:46.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:07:47.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:07:47.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: 
(( i < 300 )) 2026-03-08T23:07:47.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:47.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:47.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:47.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:47.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:47.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:25.318601+0000 '>' 2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:47.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:07:48.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:07:48.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:48.756 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:48.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:48.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:25.318601+0000 '>' 2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:48.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:07:49.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:07:49.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:07:49.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:07:49.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:07:49.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:07:49.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:07:49.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:07:50.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:07:44.963691+0000 '>' 2026-03-08T23:07:25.318601+0000 2026-03-08T23:07:50.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:242: corrupt_and_repair_one: objectstore_tool td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 SOMETHING list-attrs 
2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:07:50.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:07:50.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:07:50.192 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 SOMETHING list-attrs 2026-03-08T23:07:50.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:07:50.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:07:50.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:07:50.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:07:50.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:07:50.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 SOMETHING list-attrs 2026-03-08T23:07:50.526 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.3_head,#-3:c0000000:::scrub_1.3:head#, (61) No data available 2026-03-08T23:07:50.526 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T23:07:50.526 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: 
activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 
2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:50.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:50.815 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:07:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:07:50.816 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:07:50.816 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:07:50.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:07:50.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:07:50.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:07:50.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:07:50.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:07:50.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:07:50.832 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:50.833+0000 
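Annotation: in the ceph-osd invocation above, arguments such as `--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok` and `--log-file=td/osd-scrub-repair/$name.log` are single-quoted so the shell passes `$cluster` and `$name` through literally; Ceph expands these configuration metavariables itself at runtime. A minimal sketch of the quoting difference in isolation (the `name` shell variable here only stands in for what an accidental shell expansion would do):

```shell
# Single quotes preserve Ceph metavariables for the daemon to expand;
# double quotes would make the shell substitute them at invocation time.
name=osd.1   # stand-in: what the shell would expand $name to
literal='--log-file=td/osd-scrub-repair/$name.log'
expanded="--log-file=td/osd-scrub-repair/$name.log"
echo "$literal"    # $name survives, per-daemon log paths still work
echo "$expanded"   # already fixed to osd.1 before the daemon starts
```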
7f26913d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:50.832 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:50.833+0000 7f26913d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:50.833 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:50.833+0000 7f26913d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:50.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:07:50.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:07:50.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:07:50.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:07:50.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:07:50.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:50.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:07:50.990 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:07:50.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:50.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:51.150 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:52.023 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:52.025+0000 7f26913d88c0 -1 Falling back to public interface 2026-03-08T23:07:52.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:52.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:52.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:07:52.151 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:07:52.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:52.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:52.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:53.014 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:53.017+0000 7f26913d88c0 -1 osd.1 37 log_to_monitors true 2026-03-08T23:07:53.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:53.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:53.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:07:53.326 
INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:07:53.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:53.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:53.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:07:53.923 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:53.925+0000 7f2688388640 -1 osd.1 37 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:07:54.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:07:54.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:07:54.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:07:54.490 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:07:54.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:07:54.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 41 up_thru 41 down_at 38 last_clean_interval [35,37) [v2:127.0.0.1:6810/3983019127,v1:127.0.0.1:6811/3983019127] [v2:127.0.0.1:6812/3983019127,v1:127.0.0.1:6813/3983019127] exists,up 9eca177f-3df2-4524-8d7b-f76bc1689b44 2026-03-08T23:07:54.653 
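Annotation: the `wait_for_osd` iterations traced above are the standard poll-and-sleep pattern — run `ceph osd dump | grep 'osd.1 up'` up to 300 times, one second apart, until it succeeds (here on attempt 3). The shape of that loop, condensed into a generic helper (the `poll_until` name and the `true`/`false` stand-ins are mine, not from ceph-helpers.sh):

```shell
# Generic poll loop in the shape of wait_for_osd: retry a command until
# it succeeds or the attempt budget runs out.
poll_until() {  # poll_until <max_tries> <delay_seconds> <command...>
    local tries=$1 delay=$2 i
    shift 2
    for ((i = 0; i < tries; i++)); do
        "$@" && return 0    # condition met: stop polling
        sleep "$delay"
    done
    return 1                # budget exhausted
}

# Stand-ins for `ceph osd dump | grep 'osd.1 up'`:
poll_until 3 0 true  && echo "up detected"
poll_until 2 0 false || echo "timed out"
```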
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:07:54.653 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:07:54.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:07:54.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: true 2026-03-08T23:07:54.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:07:54.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:07:54.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:07:54.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:07:54.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:07:54.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:07:54.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:07:54.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:07:54.871 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:07:54.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:07:54.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:07:54.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
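Annotation: the delays array `get_timeout_delays 90 .1` returns above (`0.1 0.2 0.4 … 12.8 15 15 15 15 4.5`) is an exponential backoff: each delay doubles from the base, is capped at 15 s, and the final delay is trimmed so the series sums to exactly the 90 s timeout. A sketch of that rule reconstructed from the printed array (my reconstruction, not the helper's actual code; the cap of 15 is inferred from the output):

```shell
# Reconstructed backoff: double from $base, cap each step, trim the last
# step so the series sums exactly to $timeout (cf. the delays array above).
backoff_delays() {  # backoff_delays <timeout> <base> [cap]
    awk -v t="$1" -v b="$2" -v c="${3:-15}" 'BEGIN {
        d = b; sum = 0; sep = ""
        while (1) {
            if (d > c) d = c                       # cap each step
            if (sum + d >= t) {                    # trim the final step
                printf "%s%g", sep, t - sum; break
            }
            printf "%s%g", sep, d
            sum += d; sep = " "
            if (d < c) d *= 2                      # double until capped
        }
        print ""
    }'
}

backoff_delays 90 .1
```

With the trace's arguments this reproduces the array shown above, which is why `wait_for_clean` can retry quickly at first yet still respect the overall 90-second budget.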
flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:07:54.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182411 2026-03-08T23:07:54.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182411 2026-03-08T23:07:54.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182411' 2026-03-08T23:07:54.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:07:54.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:07:55.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=176093659138 2026-03-08T23:07:55.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 176093659138 2026-03-08T23:07:55.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182411 1-176093659138' 2026-03-08T23:07:55.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:07:55.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-107374182411 2026-03-08T23:07:55.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T23:07:55.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:07:55.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-107374182411 2026-03-08T23:07:55.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:07:55.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182411 2026-03-08T23:07:55.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 107374182411' 2026-03-08T23:07:55.014 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 107374182411 2026-03-08T23:07:55.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:07:55.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182410 -lt 107374182411 2026-03-08T23:07:55.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:07:56.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:07:56.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:07:56.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
test 107374182411 -lt 107374182411 2026-03-08T23:07:56.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:07:56.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-176093659138 2026-03-08T23:07:56.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:07:56.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:07:56.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-176093659138 2026-03-08T23:07:56.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:07:56.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=176093659138 2026-03-08T23:07:56.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 176093659138' 2026-03-08T23:07:56.338 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 176093659138 2026-03-08T23:07:56.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:07:56.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 176093659138 -lt 176093659138 2026-03-08T23:07:56.500 
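Annotation: `flush_pg_stats` encodes its per-OSD results as `osd-seq` pairs (`0-107374182411`, `1-176093659138`) and splits them back apart with `cut`, then polls `ceph osd last-stat-seq` until the reported sequence catches up. The high 32 bits of each sequence appear to carry the OSD's boot epoch — 176093659138 = 41·2³² + 2, and the osd dump above shows osd.1 with `up_from 41` — though that decomposition is my inference from the numbers, not something the helper relies on. The pair handling in isolation:

```shell
# The osd-seq pair format used by flush_pg_stats in the trace.
pair="1-176093659138"
osd=$(echo "$pair" | cut -d - -f 1)   # field 1: the OSD id
seq=$(echo "$pair" | cut -d - -f 2)   # field 2: the stat sequence

# Inferred decomposition: high 32 bits look like the boot epoch.
epoch=$(( seq >> 32 ))                # 41, matching osd.1's up_from above
per_boot=$(( seq & 0xffffffff ))      # low 32 bits: per-boot counter
echo "$osd $epoch $per_boot"
```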
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:07:56.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:07:56.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:07:56.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:07:56.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:07:56.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:07:56.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:07:56.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:07:56.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:07:56.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:07:56.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:07:56.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:07:56.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:07:56.864 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:07:56.864 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:07:57.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:07:57.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:07:57.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:07:57.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:243: corrupt_and_repair_one: rados --pool rbd get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:07:57.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:244: corrupt_and_repair_one: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:07:57.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:07:57.079 
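Annotation: the jq expression `get_num_active_clean` applies to `pg dump pgs` keeps only PG states that contain both `"active"` and `"clean"` and do not contain `"stale"`, then counts them — here all 4 PGs qualify, so `wait_for_clean` breaks immediately. The same selection expressed over one state string per line, for illustration (the sample states are invented, and this grep pipeline is an equivalent I wrote, not the helper's code):

```shell
# Count states containing both "active" and "clean" but not "stale",
# mirroring the jq select(...) chain in the trace. Sample input is invented.
printf '%s\n' \
    'active+clean' \
    'active+clean+scrubbing' \
    'stale+active+clean' \
    'peering' \
  | grep active | grep clean | grep -vc stale
```

The `stale` exclusion matters: a stale PG can still report an `active+clean` state string from before its OSD stopped reporting, so counting it would let `wait_for_clean` return on a cluster that is not actually healthy.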
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:07:57.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:07:57.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:07:57.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:07:57.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:07:57.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:07:57.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:07:57.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:07:57.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:07:57.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:07:57.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:07:57.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:07:57.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:07:57.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:07:57.198 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:07:57.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:07:57.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:07:57.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:07:57.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:07:57.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:07:57.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:07:57.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:07:57.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:07:57.214 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:57.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:57.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:07:57.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:07:57.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:07:57.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:07:57.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:07:57.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:07:57.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:07:57.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:07:57.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:07:57.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:07:57.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:07:57.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:07:57.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:07:57.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:07:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:07:57.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:07:57.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:07:57.222 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:57.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:57.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:07:57.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:07:57.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:07:57.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:07:57.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:07:57.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:57.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:57.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:07:57.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_scrub_erasure_appends td/osd-scrub-repair 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5746: TEST_corrupt_scrub_erasure_appends: corrupt_scrub_erasure td/osd-scrub-repair false 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3596: corrupt_scrub_erasure: local dir=td/osd-scrub-repair 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3597: corrupt_scrub_erasure: local allow_overwrites=false 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3598: corrupt_scrub_erasure: local poolname=ecpool 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3599: corrupt_scrub_erasure: local total_objs=7 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3601: corrupt_scrub_erasure: run_mon td/osd-scrub-repair a 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:07:57.226 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:07:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T23:07:57.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:07:57.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:07:57.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:07:57.248 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:07:57.249 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:57.249 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:57.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: 
get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:57.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:07:57.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:07:57.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:07:57.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:07:57.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:07:57.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:07:57.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:07:57.278 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: 
get_config: get_asok_path mon.a 2026-03-08T23:07:57.278 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:07:57.278 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:07:57.286 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:07:57.287 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:57.287 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:57.287 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:07:57.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:07:57.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:07:57.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:07:57.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:07:57.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 
2026-03-08T23:07:57.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:07:57.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:07:57.350 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:07:57.350 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:07:57.350 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:07:57.350 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:07:57.350 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:57.350 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:57.350 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:07:57.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:07:57.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:07:57.414 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3602: corrupt_scrub_erasure: run_mgr td/osd-scrub-repair x 2026-03-08T23:07:57.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:07:57.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:07:57.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:07:57.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:07:57.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:07:57.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:07:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:07:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:07:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:07:57.514 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:07:57.514 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' 
-n '' ']' 2026-03-08T23:07:57.514 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:57.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:57.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:07:57.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:07:57.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3603: corrupt_scrub_erasure: seq 0 2 2026-03-08T23:07:57.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3603: corrupt_scrub_erasure: for id in $(seq 0 2) 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3604: corrupt_scrub_erasure: run_osd td/osd-scrub-repair 0 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:07:57.543 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:07:57.543 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:07:57.543 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:07:57.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:07:57.544 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:07:57.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:07:57.545 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 c4e8c760-134f-405e-a9e9-2b0353c4d3de 2026-03-08T23:07:57.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=c4e8c760-134f-405e-a9e9-2b0353c4d3de 2026-03-08T23:07:57.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 c4e8c760-134f-405e-a9e9-2b0353c4d3de' 2026-03-08T23:07:57.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:07:57.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBNAa5py2l1IRAAD2i7gPj000MYLsZqXr4BfA== 2026-03-08T23:07:57.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBNAa5py2l1IRAAD2i7gPj000MYLsZqXr4BfA=="}' 2026-03-08T23:07:57.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new c4e8c760-134f-405e-a9e9-2b0353c4d3de -i td/osd-scrub-repair/0/new.json 2026-03-08T23:07:57.657 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:07:57.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: 
run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:07:57.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBNAa5py2l1IRAAD2i7gPj000MYLsZqXr4BfA== --osd-uuid c4e8c760-134f-405e-a9e9-2b0353c4d3de 2026-03-08T23:07:57.689 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:57.689+0000 7ffa84db68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:57.690 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:57.693+0000 7ffa84db68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:07:57.692 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:57.693+0000 7ffa84db68c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:07:57.692 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:57.693+0000 7ffa84db68c0 -1 bdev(0x555763ddcc00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:07:57.692 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:07:57.693+0000 7ffa84db68c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:07:59.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:07:59.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:07:59.980 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:07:59.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:07:59.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:08:00.119 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:08:00.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:08:00.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:08:00.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:08:00.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:08:00.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:08:00.145 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:00.145+0000 7ff197d998c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:00.147 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:00.149+0000 7ff197d998c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:00.149 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:00.149+0000 7ff197d998c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:00.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:00.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:01.111 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:01.113+0000 7ff197d998c0 -1 Falling back to public interface 2026-03-08T23:08:01.451 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:08:01.452 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:01.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:01.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:08:01.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:01.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:01.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:02.085 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:02.085+0000 7ff197d998c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:08:02.611 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:08:02.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:02.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:02.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:08:02.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:02.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:02.795 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:03.796 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:08:03.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:03.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:03.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:08:03.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:03.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1259454056,v1:127.0.0.1:6803/1259454056] [v2:127.0.0.1:6804/1259454056,v1:127.0.0.1:6805/1259454056] exists,up c4e8c760-134f-405e-a9e9-2b0353c4d3de 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3603: corrupt_scrub_erasure: for id in $(seq 0 2) 
2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3604: corrupt_scrub_erasure: run_osd td/osd-scrub-repair 1 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:08:03.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:08:03.960 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:08:03.960 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:08:03.960 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:08:03.961 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:08:03.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:08:03.962 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:08:03.962 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 3d9539b6-0f15-4899-81a5-1a6d617f533b 2026-03-08T23:08:03.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3d9539b6-0f15-4899-81a5-1a6d617f533b 2026-03-08T23:08:03.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 3d9539b6-0f15-4899-81a5-1a6d617f533b' 2026-03-08T23:08:03.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:08:03.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBTAa5pvBhfOhAA/RHubOIgY+xU0COvv/zvZA== 2026-03-08T23:08:03.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBTAa5pvBhfOhAA/RHubOIgY+xU0COvv/zvZA=="}' 2026-03-08T23:08:03.973 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3d9539b6-0f15-4899-81a5-1a6d617f533b -i td/osd-scrub-repair/1/new.json 2026-03-08T23:08:04.135 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:08:04.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:08:04.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBTAa5pvBhfOhAA/RHubOIgY+xU0COvv/zvZA== --osd-uuid 3d9539b6-0f15-4899-81a5-1a6d617f533b 2026-03-08T23:08:04.164 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:04.165+0000 7f32723d98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:04.166 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:04.169+0000 7f32723d98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:04.166 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:04.169+0000 7f32723d98c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:08:04.167 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:04.169+0000 7f32723d98c0 -1 bdev(0x557c244ebc00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:08:04.167 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:04.169+0000 7f32723d98c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:08:06.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:08:06.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:08:06.423 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:08:06.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:08:06.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:08:06.625 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:08:06.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:08:06.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:08:06.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:08:06.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:08:06.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:08:06.645 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:06.645+0000 7f1e50bf68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:06.651 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:06.653+0000 7f1e50bf68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:06.652 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:06.653+0000 7f1e50bf68c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:06.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:06.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:07.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:07.601+0000 7f1e50bf68c0 -1 Falling back to public interface 2026-03-08T23:08:07.964 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:08:07.965 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:07.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:07.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:08:07.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:07.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:08.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:08.608 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:08.609+0000 7f1e50bf68c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:08:09.133 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:08:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:08:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:09.297 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:10.298 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:08:10.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:10.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:10.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:08:10.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:10.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:10.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:11.477 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:08:11.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:11.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:11.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:08:11.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:11.477 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:11.627 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 9 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/947864310,v1:127.0.0.1:6811/947864310] [v2:127.0.0.1:6812/947864310,v1:127.0.0.1:6813/947864310] exists,up 3d9539b6-0f15-4899-81a5-1a6d617f533b 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3603: corrupt_scrub_erasure: for id in $(seq 0 2) 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3604: corrupt_scrub_erasure: run_osd td/osd-scrub-repair 2 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:08:11.628 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:08:11.628 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:08:11.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:08:11.629 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:08:11.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:08:11.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:08:11.630 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 13e2026c-a619-4ed4-ad39-7fc269eaec21 2026-03-08T23:08:11.630 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=13e2026c-a619-4ed4-ad39-7fc269eaec21 2026-03-08T23:08:11.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 13e2026c-a619-4ed4-ad39-7fc269eaec21' 2026-03-08T23:08:11.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:08:11.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBbAa5pFw6RJhAAyiOvBXeZdcNdIZc/Rv+5vg== 2026-03-08T23:08:11.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBbAa5pFw6RJhAAyiOvBXeZdcNdIZc/Rv+5vg=="}' 2026-03-08T23:08:11.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 13e2026c-a619-4ed4-ad39-7fc269eaec21 -i td/osd-scrub-repair/2/new.json 2026-03-08T23:08:11.813 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:08:11.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T23:08:11.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBbAa5pFw6RJhAAyiOvBXeZdcNdIZc/Rv+5vg== --osd-uuid 13e2026c-a619-4ed4-ad39-7fc269eaec21 2026-03-08T23:08:11.845 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:11.845+0000 7f1746e9a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:11.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:11.849+0000 7f1746e9a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:11.848 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:11.849+0000 7f1746e9a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:11.848 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:11.849+0000 7f1746e9a8c0 -1 bdev(0x55c0c7cd9c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:08:11.848 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:11.849+0000 7f1746e9a8c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T23:08:14.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T23:08:14.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:08:14.115 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:08:14.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:08:14.115 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:08:14.313 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:08:14.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:08:14.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:08:14.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:08:14.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:08:14.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:08:14.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:14.333+0000 7f48d7b158c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:08:14.351 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:14.353+0000 7f48d7b158c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:14.353 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:14.353+0000 7f48d7b158c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:14.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:08:14.632 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:15.634 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:08:15.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:15.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:15.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:08:15.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:15.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:08:15.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:15.819 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:15.821+0000 7f48d7b158c0 -1 Falling back to public interface 2026-03-08T23:08:16.790 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:16.789+0000 7f48d7b158c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:08:16.796 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:08:16.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:16.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:16.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T23:08:16.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:16.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:08:16.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:17.963 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:08:17.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:17.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:17.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:08:17.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:17.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:08:18.115 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/4181000935,v1:127.0.0.1:6819/4181000935] [v2:127.0.0.1:6820/4181000935,v1:127.0.0.1:6821/4181000935] exists,up 13e2026c-a619-4ed4-ad39-7fc269eaec21 2026-03-08T23:08:18.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:08:18.115 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:08:18.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:08:18.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3606: corrupt_scrub_erasure: create_rbd_pool 2026-03-08T23:08:18.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:08:18.256 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T23:08:18.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:08:18.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:08:18.464 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:08:18.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:08:19.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:08:19.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3607: corrupt_scrub_erasure: create_pool foo 1 2026-03-08T23:08:19.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create foo 1 2026-03-08T23:08:19.967 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' created 
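The `wait_for_osd` trace above (ceph-helpers.sh:978-991) is a simple polling loop: echo the try count, run `ceph osd dump | grep "osd.<id> up"` once per second, and give up after 300 tries. A hedged Python sketch of that logic, with the dump and the sleep injected so it runs without a live cluster (this is a re-implementation for illustration, not the real helper):

```python
# Sketch of the wait_for_osd loop traced in the log: poll the osd dump up to
# max_tries times, one second apart, until "osd.<id> <state>" appears.
# dump_osd and sleep are injected hooks (assumptions of this sketch).
def wait_for_osd(state, osd_id, dump_osd, max_tries=300, sleep=lambda s: None):
    for i in range(max_tries):
        print(i)                                  # the trace echoes the try count
        if f"osd.{osd_id} {state}" in dump_osd():
            return 0                              # matched: success
        sleep(1)
    return 1                                      # timed out

# In the log, osd.2 shows up on the fourth poll (echoes 0..3):
polls = iter(["", "", "", "osd.2 up in weight 1 up_from 14"])
assert wait_for_osd("up", 2, lambda: next(polls)) == 0
```

The real helper returns the shell status of the final `grep`, which gives the same 0/1 convention.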
2026-03-08T23:08:19.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:08:20.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3609: corrupt_scrub_erasure: create_ec_pool ecpool false k=2 m=1 stripe_unit=2K --force 2026-03-08T23:08:20.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T23:08:20.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T23:08:20.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=false 2026-03-08T23:08:20.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T23:08:20.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=1 stripe_unit=2K --force 2026-03-08T23:08:21.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T23:08:21.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T23:08:21.531 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T23:08:21.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:08:22.546 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' false = true ']' 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:08:22.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:08:22.608 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:08:22.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:08:22.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:08:22.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:08:22.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:08:22.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:08:22.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:08:22.769 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:22.769 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:08:22.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:08:22.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:22.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:08:22.842 
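The `delays` array that `get_timeout_delays 90 .1` produces above doubles the sleep interval from 0.1 s, caps it (at 15 s, judging by the traced output), and appends a final remainder so the intervals sum to the 90 s timeout. A hedged reconstruction of that logic (the real helper does its arithmetic with `bc` and toggles xtrace; the cap default here is inferred from the log, not from the source):

```python
# Reconstruction of get_timeout_delays as traced: doubling backoff, capped at
# max_step, padded with a remainder so the delays sum to the timeout.
def get_timeout_delays(timeout, first_step=1.0, max_step=15.0):
    delays, total, step = [], 0.0, first_step
    while total + step <= timeout:
        delays.append(step)
        total += step
        step = min(step * 2, max_step)            # exponential backoff with a cap
    if timeout > total:
        delays.append(round(timeout - total, 10)) # remainder pads the sum to timeout
    return delays

# Matches the array traced in the log for `get_timeout_delays 90 .1`:
assert get_timeout_delays(90, 0.1) == [0.1, 0.2, 0.4, 0.8, 1.6, 3.2,
                                       6.4, 12.8, 15, 15, 15, 15, 4.5]
```

This is why `wait_for_clean` retries quickly at first but settles into 15-second polls while still respecting its overall 90-second budget.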
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T23:08:22.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T23:08:22.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:08:22.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:22.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:08:22.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705668 2026-03-08T23:08:22.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705668 2026-03-08T23:08:22.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-38654705668' 2026-03-08T23:08:22.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:22.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:08:22.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542147 2026-03-08T23:08:22.986 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542147 2026-03-08T23:08:22.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-38654705668 2-60129542147' 2026-03-08T23:08:22.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:22.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:08:22.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:22.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:08:22.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:08:22.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:22.988 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:08:22.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:08:22.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:08:22.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:08:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T23:08:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:08:24.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:08:24.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:08:24.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485 2026-03-08T23:08:24.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:24.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705668 2026-03-08T23:08:24.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:24.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:08:24.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705668 2026-03-08T23:08:24.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:24.300 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 
seq 38654705668 2026-03-08T23:08:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705668 2026-03-08T23:08:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705668' 2026-03-08T23:08:24.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:08:24.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705668 -lt 38654705668 2026-03-08T23:08:24.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:24.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542147 2026-03-08T23:08:24.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:24.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:08:24.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542147 2026-03-08T23:08:24.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:24.454 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542147 2026-03-08T23:08:24.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=60129542147 2026-03-08T23:08:24.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542147' 2026-03-08T23:08:24.454 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:08:24.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542147 -lt 60129542147 2026-03-08T23:08:24.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:08:24.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:24.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:24.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:08:24.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:08:24.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:08:24.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:08:24.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and 
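The `flush_pg_stats` trace above follows a two-phase pattern: `ceph tell osd.N flush_pg_stats` returns a sequence number per OSD, and the helper then polls `ceph osd last-stat-seq N` (sleeping 1 s between polls) until the monitor has absorbed stats at least that fresh. A hedged sketch of that logic with the two ceph calls injected as functions (hypothetical hooks, not real API):

```python
# Sketch of flush_pg_stats as traced: collect a flush sequence per OSD, then
# wait until the cluster's last-stat-seq for each OSD catches up to it.
# tell_flush, last_stat_seq and sleep stand in for the ceph CLI calls.
def flush_pg_stats(tell_flush, last_stat_seq, osd_ids, sleep=lambda s: None):
    seqs = {osd: tell_flush(osd) for osd in osd_ids}   # e.g. {0: 21474836485, ...}
    for osd, seq in seqs.items():
        print(f"waiting osd.{osd} seq {seq}")
        while last_stat_seq(osd) < seq:                # stats not yet flushed
            sleep(1)

# As in the log, osd.0 lags one poll behind its flush sequence:
reported = {0: iter([21474836484, 21474836485]), 1: iter([38654705668])}
flush_pg_stats(lambda o: {0: 21474836485, 1: 38654705668}[o],
               lambda o: next(reported[o]), [0, 1])
```

That one-poll lag is exactly what the trace shows for osd.0: `test 21474836484 -lt 21474836485` fails the first check, sleeps, then passes.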
contains("clean")) | ' 2026-03-08T23:08:24.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:08:24.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:08:24.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:08:24.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:08:24.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:08:24.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:24.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:08:25.134 
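The jq expression `get_num_active_clean` builds above counts PGs whose state string contains both "active" and "clean" but not "stale". The same predicate in Python, applied to the `pg_stats` shape that `ceph --format json pg dump pgs` returns (structure abridged here to just the `state` field):

```python
# Equivalent of the traced jq filter:
#   [.[] | .state | select(contains("active") and contains("clean"))
#                 | select(contains("stale") | not)] | length
def num_active_clean(pg_stats):
    return sum(1 for pg in pg_stats
               if "active" in pg["state"]
               and "clean" in pg["state"]
               and "stale" not in pg["state"])

pgs = [{"state": "active+clean"},          # counts
       {"state": "active+clean+scrubbing"},# counts: substring match, like jq
       {"state": "stale+active+clean"},    # excluded: stale
       {"state": "peering"}]               # excluded: not active+clean
assert num_active_clean(pgs) == 2
```

`wait_for_clean` succeeds above because this count (6) equals `get_num_pgs` (6): every PG in the cluster is active+clean and none is stale.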
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3610: corrupt_scrub_erasure: wait_for_clean 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:08:25.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:08:25.192 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:08:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:08:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:08:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:08:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:08:25.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:08:25.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:08:25.345 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:25.345 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:08:25.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:08:25.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:25.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:08:25.417 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836487 2026-03-08T23:08:25.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836487 2026-03-08T23:08:25.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487' 2026-03-08T23:08:25.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:25.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:08:25.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705669 2026-03-08T23:08:25.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705669 2026-03-08T23:08:25.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-38654705669' 2026-03-08T23:08:25.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:25.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:08:25.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542148 2026-03-08T23:08:25.559 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542148 2026-03-08T23:08:25.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-38654705669 2-60129542148' 2026-03-08T23:08:25.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:25.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836487 2026-03-08T23:08:25.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:25.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:08:25.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836487 2026-03-08T23:08:25.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:25.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836487 2026-03-08T23:08:25.562 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836487 2026-03-08T23:08:25.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836487' 2026-03-08T23:08:25.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:08:25.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836487 2026-03-08T23:08:25.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:08:26.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:08:26.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:08:26.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836487 2026-03-08T23:08:26.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:26.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:26.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705669 2026-03-08T23:08:26.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:08:26.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705669 2026-03-08T23:08:26.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:26.883 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 
seq 38654705669 2026-03-08T23:08:26.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705669 2026-03-08T23:08:26.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705669' 2026-03-08T23:08:26.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:08:27.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705670 -lt 38654705669 2026-03-08T23:08:27.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:27.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542148 2026-03-08T23:08:27.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:27.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:08:27.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542148 2026-03-08T23:08:27.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:27.048 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542148 2026-03-08T23:08:27.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=60129542148 2026-03-08T23:08:27.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542148' 2026-03-08T23:08:27.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:08:27.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542148 -lt 60129542148 2026-03-08T23:08:27.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:08:27.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:27.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:27.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:08:27.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:08:27.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:08:27.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:08:27.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and 
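The `flush_pg_stats` trace above accumulates one `osd-seq` entry per OSD (line 2270 of `ceph-helpers.sh`) and then splits each entry back apart with `cut` (lines 2274-2275). A minimal sketch of that round trip, with the `ceph osd ls` / `ceph tell ... flush_pg_stats` calls stubbed out and the seq values copied from the trace (both assumptions, for illustration only):

```shell
# Reproduces the pair bookkeeping from flush_pg_stats: build " osd-seq"
# entries, then split them with cut the same way the helper does.
# The ceph calls are stubbed; the seq values are copied from the trace.
flush_pairs_sketch() {
    local seqs='' pair osd seq s
    for pair in 0:21474836487 1:38654705669 2:60129542148; do
        osd=${pair%%:*}
        seq=${pair##*:}
        seqs="$seqs $osd-$seq"             # same accumulation as line 2270
    done
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)   # OSD id        (line 2274)
        seq=$(echo "$s" | cut -d - -f 2)   # target stat seq (line 2275)
        echo "waiting osd.$osd seq $seq"
    done
}
flush_pairs_sketch
```

This prints the same three `waiting osd.N seq ...` lines that appear on stdout in the trace; the real helper then polls `ceph osd last-stat-seq $osd` until it reaches each target.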
contains("clean")) | ' 2026-03-08T23:08:27.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:08:27.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:08:27.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:08:27.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:08:27.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:08:27.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:27.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:27.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:08:27.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:08:27.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:08:27.763 
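`wait_for_clean` above takes its retry schedule from `get_timeout_delays 90 .1`, which the trace shows expanding to `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`. A sketch of how that schedule can be produced, under the assumption (consistent with the trace, not taken from the helper's source) that the step doubles each iteration, is capped at 15 s, and a final remainder tops the sum up to the timeout:

```shell
# Sketch of the delay schedule seen in the wait_for_clean trace.
# Assumption: delays double from the initial step, are capped at 15 s,
# and a trailing remainder keeps the total equal to the timeout (90 s).
get_timeout_delays_sketch() {
    local timeout=$1 step=$2
    awk -v timeout="$timeout" -v step="$step" 'BEGIN {
        total = 0
        while (total + step < timeout) {
            printf "%s%g", (total ? " " : ""), step
            total += step
            step *= 2
            if (step > 15) step = 15   # cap the backoff at 15 seconds
        }
        if (timeout > total)           # remainder so delays sum to timeout
            printf " %g", timeout - total
        print ""
    }'
}
get_timeout_delays_sketch 90 .1
```

With the trace's arguments (timeout 90, initial step 0.1) this reproduces the exact thirteen-element `delays` array shown above, including the trailing 4.5 that makes the sum 90.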
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: seq 1 7 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ1 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ1 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ1 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:08:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:08:27.979 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:08:27.995 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:08:28.196 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:08:28.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:08:28.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:08:28.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ1 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:08:28.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 1 % 2 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=1 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3621: corrupt_scrub_erasure: local payload=UVWXYZZZ 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3622: corrupt_scrub_erasure: echo UVWXYZZZ 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3623: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 1 EOBJ1 set-bytes 
td/osd-scrub-repair/CORRUPT 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 EOBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:08:28.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:08:28.234 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:08:28.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:08:28.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:08:28.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:08:28.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:08:28.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:08:28.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 EOBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:08:28.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:08:28.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:08:28.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:08:28.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:08:28.339 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:08:28.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 EOBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:08:29.526 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:08:29.526 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:08:29.526 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:08:29.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:08:29.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:08:29.527 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:08:29.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:08:29.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:08:29.528 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:08:29.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:08:29.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:08:29.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:08:29.544 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:29.541+0000 7f36e1d248c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:29.546 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:29.549+0000 7f36e1d248c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:29.548 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:29.549+0000 7f36e1d248c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:29.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:30.499 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:30.501+0000 7f36e1d248c0 -1 Falling back to public interface 2026-03-08T23:08:30.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:08:30.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:30.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:08:30.865 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:30.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:30.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:31.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:31.486 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:31.489+0000 7f36e1d248c0 -1 osd.1 33 log_to_monitors true 2026-03-08T23:08:32.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:32.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:32.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:08:32.029 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:08:32.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:32.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:32.213 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:32.366 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:32.365+0000 7f36d8cd4640 -1 osd.1 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:08:33.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:33.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:33.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:08:33.215 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:08:33.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:33.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:33.385 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 37 up_thru 37 down_at 34 last_clean_interval [9,33) [v2:127.0.0.1:6810/700632434,v1:127.0.0.1:6811/700632434] [v2:127.0.0.1:6812/700632434,v1:127.0.0.1:6813/700632434] exists,up 3d9539b6-0f15-4899-81a5-1a6d617f533b 2026-03-08T23:08:33.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:08:33.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:08:33.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: 
wait_for_osd: return 0 2026-03-08T23:08:33.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:08:33.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:08:33.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' 
'4.5') 2026-03-08T23:08:33.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:08:33.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:08:33.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:08:33.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:08:33.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:08:33.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:08:33.616 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:33.616 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:08:33.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:08:33.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:33.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:08:33.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836489 2026-03-08T23:08:33.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
21474836489 2026-03-08T23:08:33.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489' 2026-03-08T23:08:33.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:33.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:08:33.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=158913789954 2026-03-08T23:08:33.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 158913789954 2026-03-08T23:08:33.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489 1-158913789954' 2026-03-08T23:08:33.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:33.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:08:33.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542151 2026-03-08T23:08:33.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542151 2026-03-08T23:08:33.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489 1-158913789954 2-60129542151' 
2026-03-08T23:08:33.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:33.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836489 2026-03-08T23:08:33.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:33.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:08:33.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836489 2026-03-08T23:08:33.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:33.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836489 2026-03-08T23:08:33.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836489' 2026-03-08T23:08:33.861 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836489 2026-03-08T23:08:33.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:08:34.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836488 -lt 21474836489 2026-03-08T23:08:34.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T23:08:35.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:08:35.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:08:35.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836489 -lt 21474836489 2026-03-08T23:08:35.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:35.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-158913789954 2026-03-08T23:08:35.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:35.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:08:35.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-158913789954 2026-03-08T23:08:35.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:35.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=158913789954 2026-03-08T23:08:35.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 158913789954' 2026-03-08T23:08:35.213 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 
seq 158913789954 2026-03-08T23:08:35.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:08:35.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 158913789954 -lt 158913789954 2026-03-08T23:08:35.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:35.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542151 2026-03-08T23:08:35.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:35.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:08:35.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542151 2026-03-08T23:08:35.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:35.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542151 2026-03-08T23:08:35.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542151' 2026-03-08T23:08:35.377 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542151 2026-03-08T23:08:35.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:08:35.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542151 -lt 60129542151 2026-03-08T23:08:35.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:08:35.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:35.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:35.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:08:35.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:08:35.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:08:35.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:08:35.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:08:35.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:08:35.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: 
get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:08:35.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:08:35.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:08:35.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:08:35.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:35.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ2 2026-03-08T23:08:36.089 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ2 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ2 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:08:36.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:08:36.299 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:08:36.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:08:36.507 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:08:36.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:08:36.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo 
ABCDEF 2026-03-08T23:08:36.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ2 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:08:36.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 2 % 2 2026-03-08T23:08:36.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=0 2026-03-08T23:08:36.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:08:36.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3628: corrupt_scrub_erasure: dd if=/dev/urandom of=td/osd-scrub-repair/CORRUPT bs=2048 count=1 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:2048 bytes (2.0 kB, 2.0 KiB) copied, 8.6661e-05 s, 23.6 MB/s 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3629: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:08:36.546 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:08:36.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:08:36.547 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:08:36.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:08:36.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:08:36.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:08:36.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:08:36.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:08:36.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:08:36.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:08:36.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:08:36.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:08:37.841 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:08:37.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:08:37.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:08:37.842 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:08:37.842 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:08:37.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:08:37.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:08:37.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:08:37.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:08:37.845 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:08:37.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:08:37.861 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:37.861+0000 7f8616da18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:37.861 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:37.861+0000 7f8616da18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:37.863 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:37.865+0000 7f8616da18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:38.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:38.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:38.575 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:38.577+0000 7f8616da18c0 -1 Falling back to public interface 2026-03-08T23:08:39.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:39.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:39.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:08:39.196 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:39.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:39.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:39.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:39.543 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:39.545+0000 7f8616da18c0 -1 osd.0 40 log_to_monitors true 2026-03-08T23:08:40.369 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:40.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:40.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:08:40.369 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:08:40.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:40.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:40.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:40.994 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:40.993+0000 7f860dd51640 -1 osd.0 40 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:08:41.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:41.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:41.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:08:41.535 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:08:41.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:41.535 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 44 up_thru 44 down_at 41 last_clean_interval [5,40) [v2:127.0.0.1:6802/563920110,v1:127.0.0.1:6803/563920110] [v2:127.0.0.1:6804/563920110,v1:127.0.0.1:6805/563920110] exists,up c4e8c760-134f-405e-a9e9-2b0353c4d3de 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:08:41.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:08:41.702 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 
2026-03-08T23:08:41.702 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:08:41.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:08:41.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:08:41.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:08:41.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:08:41.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:08:41.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:08:41.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:08:41.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:08:41.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:08:41.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:08:41.933 
INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:41.933 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:08:41.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:08:41.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:41.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:08:42.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561026 2026-03-08T23:08:42.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561026 2026-03-08T23:08:42.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561026' 2026-03-08T23:08:42.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:42.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:08:42.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=158913789957 2026-03-08T23:08:42.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 158913789957 2026-03-08T23:08:42.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-188978561026 1-158913789957' 2026-03-08T23:08:42.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:42.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:08:42.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542153 2026-03-08T23:08:42.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542153 2026-03-08T23:08:42.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561026 1-158913789957 2-60129542153' 2026-03-08T23:08:42.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:42.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-188978561026 2026-03-08T23:08:42.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:42.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:08:42.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-188978561026 2026-03-08T23:08:42.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:08:42.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561026 2026-03-08T23:08:42.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 188978561026' 2026-03-08T23:08:42.166 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 188978561026 2026-03-08T23:08:42.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:08:42.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561026 -lt 188978561026 2026-03-08T23:08:42.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:42.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-158913789957 2026-03-08T23:08:42.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:42.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:08:42.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-158913789957 2026-03-08T23:08:42.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:42.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=158913789957 2026-03-08T23:08:42.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 158913789957' 2026-03-08T23:08:42.325 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 158913789957 2026-03-08T23:08:42.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:08:42.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 158913789956 -lt 158913789957 2026-03-08T23:08:42.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:08:43.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:08:43.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:08:43.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 158913789956 -lt 158913789957 2026-03-08T23:08:43.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:08:44.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:08:44.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:08:44.823 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 158913789957 -lt 158913789957 2026-03-08T23:08:44.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:44.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542153 2026-03-08T23:08:44.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:44.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:08:44.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542153 2026-03-08T23:08:44.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:44.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542153 2026-03-08T23:08:44.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542153' 2026-03-08T23:08:44.826 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542153 2026-03-08T23:08:44.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:08:44.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542154 -lt 
60129542153 2026-03-08T23:08:44.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:08:44.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:44.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:45.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:08:45.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:08:45.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:08:45.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:08:45.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:08:45.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:08:45.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:08:45.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq 
'.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:08:45.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:08:45.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:08:45.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:45.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:45.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:08:45.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:08:45.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:08:45.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:08:45.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ3 2026-03-08T23:08:45.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ3 2026-03-08T23:08:45.537 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:08:45.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:08:45.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ3 2026-03-08T23:08:45.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:08:45.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:08:45.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:08:45.753 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:08:45.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:08:45.961 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:08:45.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:08:45.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:08:45.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ3 
td/osd-scrub-repair/ORIGINAL 2026-03-08T23:08:45.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 3 % 2 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=1 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3634: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 1 EOBJ3 remove 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 EOBJ3 remove 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:08:45.999 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:08:45.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:08:46.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:08:46.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 EOBJ3 remove 2026-03-08T23:08:46.305 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:08:46.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:08:46.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:08:46.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:08:46.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:08:46.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 EOBJ3 remove 2026-03-08T23:08:46.962 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#3:b197b25d:::EOBJ3:head# 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:08:47.494 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:08:47.494 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 
2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:08:47.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:08:47.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:08:47.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:08:47.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:08:47.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:08:47.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:08:47.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:08:47.495 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:08:47.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 
--auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:08:47.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:08:47.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:08:47.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:08:47.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:08:47.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:08:47.512 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:47.513+0000 7fead2b5a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:47.513 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:47.513+0000 7fead2b5a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:47.517 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:47.517+0000 7fead2b5a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:47.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:48.483 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:48.485+0000 7fead2b5a8c0 -1 Falling back to public interface 2026-03-08T23:08:48.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:08:48.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:48.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:08:48.833 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:48.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:48.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:48.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:49.465 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:49.465+0000 7fead2b5a8c0 -1 osd.1 47 log_to_monitors true 2026-03-08T23:08:49.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:49.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:49.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:08:49.993 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:08:49.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:49.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:50.186 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:08:51.188 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:08:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:08:51.346 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 51 up_thru 51 down_at 48 last_clean_interval [37,47) [v2:127.0.0.1:6810/1855475991,v1:127.0.0.1:6811/1855475991] [v2:127.0.0.1:6812/1855475991,v1:127.0.0.1:6813/1855475991] exists,up 3d9539b6-0f15-4899-81a5-1a6d617f533b 2026-03-08T23:08:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:08:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:08:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:08:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T23:08:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:08:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:08:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:08:51.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:08:51.348 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:08:51.348 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:08:51.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:08:51.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:08:51.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:08:51.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:08:51.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:08:51.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:08:51.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:08:51.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:08:51.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:08:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:08:51.580 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:51.580 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:08:51.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:08:51.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:51.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:08:51.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561029 2026-03-08T23:08:51.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561029 2026-03-08T23:08:51.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-188978561029' 2026-03-08T23:08:51.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:51.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:08:51.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332098 2026-03-08T23:08:51.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332098 2026-03-08T23:08:51.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561029 1-219043332098' 2026-03-08T23:08:51.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:51.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:08:51.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542156 2026-03-08T23:08:51.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542156 2026-03-08T23:08:51.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561029 1-219043332098 2-60129542156' 2026-03-08T23:08:51.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:08:51.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-188978561029 2026-03-08T23:08:51.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:51.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:08:51.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-188978561029 2026-03-08T23:08:51.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:51.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561029 2026-03-08T23:08:51.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 188978561029' 2026-03-08T23:08:51.812 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 188978561029 2026-03-08T23:08:51.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:08:51.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561028 -lt 188978561029 2026-03-08T23:08:51.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:08:52.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 
-eq 0 ']' 2026-03-08T23:08:52.973 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:08:53.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561029 -lt 188978561029 2026-03-08T23:08:53.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:53.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332098 2026-03-08T23:08:53.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:53.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:08:53.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332098 2026-03-08T23:08:53.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:53.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332098 2026-03-08T23:08:53.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332098' 2026-03-08T23:08:53.139 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332098 2026-03-08T23:08:53.139 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:08:53.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332098 -lt 219043332098 2026-03-08T23:08:53.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:53.301 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542156 2026-03-08T23:08:53.301 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:53.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:08:53.302 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542156 2026-03-08T23:08:53.302 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:53.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542156 2026-03-08T23:08:53.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542156' 2026-03-08T23:08:53.303 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542156 2026-03-08T23:08:53.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:08:53.467 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542156 -lt 60129542156 2026-03-08T23:08:53.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:08:53.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:53.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:53.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:08:53.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:08:53.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:08:53.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:08:53.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:08:53.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:08:53.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
2026-03-08T23:08:53.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:08:53.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:08:53.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:08:53.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:08:53.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:08:54.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:08:54.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:08:54.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:08:54.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:08:54.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ4 2026-03-08T23:08:54.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: 
corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ4 2026-03-08T23:08:54.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:08:54.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:08:54.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ4 2026-03-08T23:08:54.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:08:54.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:08:54.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:08:54.183 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:08:54.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:08:54.391 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:08:54.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:08:54.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:08:54.404 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ4 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:08:54.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 4 % 2 2026-03-08T23:08:54.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=0 2026-03-08T23:08:54.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:08:54.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3638: corrupt_scrub_erasure: rados --pool ecpool setxattr EOBJ4 key1-EOBJ4 val1-EOBJ4 2026-03-08T23:08:54.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3639: corrupt_scrub_erasure: rados --pool ecpool setxattr EOBJ4 key2-EOBJ4 val2-EOBJ4 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3642: corrupt_scrub_erasure: echo -n bad-val 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3643: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ4 set-attr _key1-EOBJ4 td/osd-scrub-repair/bad-val 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 
2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ4 set-attr _key1-EOBJ4 td/osd-scrub-repair/bad-val 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:08:54.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:08:54.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local 
trace=true 2026-03-08T23:08:54.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:08:54.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:08:54.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:08:54.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ4 set-attr _key1-EOBJ4 td/osd-scrub-repair/bad-val 2026-03-08T23:08:54.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:08:54.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:08:54.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:08:54.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:08:54.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:08:54.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ4 set-attr _key1-EOBJ4 td/osd-scrub-repair/bad-val 2026-03-08T23:08:55.786 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:08:55.786 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:08:55.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:08:55.787 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:08:55.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:08:55.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:08:55.788 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:08:55.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:08:55.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:08:55.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:08:55.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:08:55.790 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:08:55.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:08:55.804 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:55.805+0000 7fd68d1c78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:55.805 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:55.805+0000 7fd68d1c78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:55.810 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:55.809+0000 7fd68d1c78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:08:55.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:55.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:56.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:56.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:56.261+0000 7fd68d1c78c0 -1 Falling back to public interface 2026-03-08T23:08:57.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:57.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:57.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:08:57.140 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:57.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:57.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:57.239 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:08:57.241+0000 7fd68d1c78c0 -1 osd.0 54 log_to_monitors true 2026-03-08T23:08:57.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:08:58.329 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:08:58.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:08:58.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:08:58.329 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:08:58.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:08:58.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 58 up_thru 58 down_at 55 last_clean_interval [44,54) [v2:127.0.0.1:6802/4282082266,v1:127.0.0.1:6803/4282082266] [v2:127.0.0.1:6804/4282082266,v1:127.0.0.1:6805/4282082266] exists,up c4e8c760-134f-405e-a9e9-2b0353c4d3de 2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 
2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:08:58.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:08:58.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:08:58.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:08:58.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:08:58.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:08:58.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:08:58.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:08:58.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:08:58.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 
2026-03-08T23:08:58.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:08:58.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:08:58.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:08:58.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:08:58.714 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:08:58.714 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:08:58.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:08:58.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:58.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:08:58.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=249108103170 2026-03-08T23:08:58.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 249108103170 2026-03-08T23:08:58.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103170' 2026-03-08T23:08:58.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: 
for osd in $ids 2026-03-08T23:08:58.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:08:58.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332100 2026-03-08T23:08:58.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332100 2026-03-08T23:08:58.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103170 1-219043332100' 2026-03-08T23:08:58.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:08:58.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:08:58.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542159 2026-03-08T23:08:58.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542159 2026-03-08T23:08:58.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103170 1-219043332100 2-60129542159' 2026-03-08T23:08:58.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:08:58.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-249108103170 
2026-03-08T23:08:58.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:08:58.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:08:58.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-249108103170 2026-03-08T23:08:58.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:08:58.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=249108103170 2026-03-08T23:08:58.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 249108103170' 2026-03-08T23:08:58.945 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 249108103170 2026-03-08T23:08:58.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:08:59.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 249108103170 2026-03-08T23:08:59.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:09:00.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:09:00.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:09:00.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103170 -lt 249108103170 2026-03-08T23:09:00.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:00.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332100 2026-03-08T23:09:00.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:00.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:09:00.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332100 2026-03-08T23:09:00.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:00.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332100 2026-03-08T23:09:00.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332100' 2026-03-08T23:09:00.269 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332100 2026-03-08T23:09:00.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:09:00.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 219043332101 -lt 219043332100 2026-03-08T23:09:00.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:00.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542159 2026-03-08T23:09:00.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:00.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:09:00.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542159 2026-03-08T23:09:00.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:00.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542159 2026-03-08T23:09:00.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542159' 2026-03-08T23:09:00.434 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542159 2026-03-08T23:09:00.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:09:00.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542159 -lt 60129542159 2026-03-08T23:09:00.596 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:09:00.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:09:00.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:09:00.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:09:00.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:09:00.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:09:00.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:09:00.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:09:00.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:09:00.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:09:00.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:09:00.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:09:00.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:09:00.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:09:00.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3644: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 2 EOBJ4 rm-attr _key2-EOBJ4 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: 
objectstore_tool: local id=2 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 EOBJ4 rm-attr _key2-EOBJ4 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 
2026-03-08T23:09:01.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:09:01.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:09:01.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 EOBJ4 rm-attr _key2-EOBJ4 2026-03-08T23:09:01.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:09:01.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:09:01.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T23:09:01.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:09:01.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:09:01.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 EOBJ4 rm-attr _key2-EOBJ4 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T23:09:02.602 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:09:02.602 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:09:02.602 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:09:02.603 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:09:02.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:09:02.603 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:09:02.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:09:02.604 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2 2026-03-08T23:09:02.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:09:02.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T23:09:02.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:09:02.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:09:02.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:09:02.609 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:09:02.624 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:02.621+0000 7f47980018c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:02.627 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:02.629+0000 7f47980018c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:02.629 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:02.629+0000 7f47980018c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:09:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:09:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:09:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:09:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:09:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:02.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:09:02.787 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:09:02.788 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:02.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:09:02.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:03.335 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:03.337+0000 7f47980018c0 -1 Falling back to public interface 2026-03-08T23:09:03.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:03.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:03.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:09:03.950 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:09:03.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:03.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:09:04.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:04.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:04.313+0000 7f47980018c0 -1 osd.2 59 log_to_monitors true 2026-03-08T23:09:05.054 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:05.053+0000 7f478efb1640 -1 osd.2 59 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or 
directory 2026-03-08T23:09:05.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:05.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:05.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:09:05.118 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:09:05.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:05.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:09:05.292 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 63 up_thru 63 down_at 60 last_clean_interval [14,59) [v2:127.0.0.1:6818/2764314051,v1:127.0.0.1:6819/2764314051] [v2:127.0.0.1:6820/2764314051,v1:127.0.0.1:6821/2764314051] exists,up 13e2026c-a619-4ed4-ad39-7fc269eaec21 2026-03-08T23:09:05.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:09:05.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:09:05.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:09:05.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:09:05.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: 
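The `wait_for_osd` trace above polls `ceph osd dump | grep 'osd.2 up'` once per second for up to 300 attempts before giving up. The generic retry pattern behind it can be sketched like this (`poll_until` is my name, not one from ceph-helpers.sh):

```bash
#!/usr/bin/env bash
# Retry a probe command a bounded number of times, sleeping between attempts.
# Returns 0 as soon as the probe succeeds, 1 if the budget runs out.
poll_until() {
    local tries=$1 interval=$2
    shift 2
    local i
    for (( i = 0; i < tries; i++ )); do
        "$@" && return 0        # probe succeeded: the awaited state holds
        sleep "$interval"
    done
    return 1                    # budget exhausted without success
}

# wait_for_osd-style usage (the ceph pipeline is the one from the trace above):
# poll_until 300 1 sh -c 'ceph osd dump | grep -q "osd.2 up"'
```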
wait_for_clean: local cmd= 2026-03-08T23:09:05.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:09:05.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:09:05.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:09:05.293 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:09:05.293 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:09:05.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:09:05.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:09:05.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:09:05.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:09:05.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:09:05.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local 
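The `delays` array printed above — `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5` — is `get_timeout_delays 90 .1` at work: each delay doubles, steps are capped at 15 s, and the final step is truncated so the series sums to exactly the 90 s timeout. A reimplementation sketch that reproduces that behavior (an assumption inferred from the trace, not the upstream code; `ds`, `fmt`, and `timeout_delays` are my names, and bash's integer-only arithmetic is worked around by computing in tenths of a second):

```bash
#!/usr/bin/env bash
# decimal string like ".1" or "90" -> integer tenths of a second
ds() {
    local whole=${1%.*} frac=0
    [[ $1 == *.* ]] && frac=${1#*.}
    [[ -z $whole ]] && whole=0
    echo $(( 10#$whole * 10 + 10#$frac ))
}

# integer tenths -> decimal string ("45" -> "4.5", "150" -> "15")
fmt() {
    if (( $1 % 10 == 0 )); then echo $(( $1 / 10 ))
    else echo "$(( $1 / 10 )).$(( $1 % 10 ))"
    fi
}

# Emit a doubling backoff series whose sum is exactly the timeout:
# double each delay, cap steps at max_step seconds, truncate the last step.
timeout_delays() {
    local t d cap sum=0 out=()
    t=$(ds "$1"); d=$(ds "$2"); cap=$(( ${3:-15} * 10 ))
    while (( sum < t )); do
        (( d > cap )) && d=$cap
        (( sum + d > t )) && d=$(( t - sum ))
        out+=( "$(fmt "$d")" )
        (( sum += d ))
        (( d *= 2 ))
    done
    echo "${out[@]}"
}

timeout_delays 90 .1
# → 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

The shape matters for tests like this one: early retries are cheap, but total wait is bounded by a known timeout.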
-i loop=0 2026-03-08T23:09:05.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:09:05.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:09:05.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:09:05.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:09:05.532 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:09:05.532 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:09:05.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:09:05.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:09:05.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:09:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=249108103172 2026-03-08T23:09:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 249108103172 2026-03-08T23:09:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103172' 2026-03-08T23:09:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: 
flush_pg_stats: for osd in $ids 2026-03-08T23:09:05.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:09:05.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332103 2026-03-08T23:09:05.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332103 2026-03-08T23:09:05.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103172 1-219043332103' 2026-03-08T23:09:05.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:09:05.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:09:05.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=270582939650 2026-03-08T23:09:05.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 270582939650 2026-03-08T23:09:05.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103172 1-219043332103 2-270582939650' 2026-03-08T23:09:05.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:05.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
0-249108103172 2026-03-08T23:09:05.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:05.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:09:05.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-249108103172 2026-03-08T23:09:05.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:05.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=249108103172 2026-03-08T23:09:05.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 249108103172' 2026-03-08T23:09:05.759 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 249108103172 2026-03-08T23:09:05.760 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:09:05.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103171 -lt 249108103172 2026-03-08T23:09:05.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:09:06.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:09:06.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:09:07.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103172 -lt 249108103172 2026-03-08T23:09:07.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:07.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332103 2026-03-08T23:09:07.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:07.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:09:07.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332103 2026-03-08T23:09:07.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:07.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332103 2026-03-08T23:09:07.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332103' 2026-03-08T23:09:07.072 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332103 2026-03-08T23:09:07.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:09:07.235 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332103 -lt 219043332103 2026-03-08T23:09:07.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:07.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-270582939650 2026-03-08T23:09:07.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:07.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:09:07.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-270582939650 2026-03-08T23:09:07.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:07.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=270582939650 2026-03-08T23:09:07.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 270582939650' 2026-03-08T23:09:07.237 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 270582939650 2026-03-08T23:09:07.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:09:07.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 270582939650 -lt 
270582939650 2026-03-08T23:09:07.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:09:07.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:09:07.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:09:07.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:09:07.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:09:07.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:09:07.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:09:07.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:09:07.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:09:07.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:09:07.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq 
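`flush_pg_stats`, traced above, asks each OSD for a flush sequence number (`ceph tell osd.N flush_pg_stats`) and then spins until `ceph osd last-stat-seq N` catches up to it — note the first poll of osd.0 returned 249108103171, one short of the requested 249108103172, forcing a one-second sleep. The catch-up wait can be sketched generically (`wait_for_seq` is a hypothetical name of mine; the reader command stands in for the real ceph query):

```bash
#!/usr/bin/env bash
# Poll a published sequence number until it is at least the target,
# with a bounded number of one-second retries (300 in the real helper).
wait_for_seq() {
    local target=$1 tries=$2
    shift 2
    local cur
    while (( tries-- > 0 )); do
        cur=$("$@")
        # caught up: stats at least as new as the flush request are published
        (( cur >= target )) && return 0
        sleep 1
    done
    return 1
}

# flush_pg_stats-style usage (commands from the trace above):
# seq=$(ceph tell osd.0 flush_pg_stats)
# wait_for_seq "$seq" 300 ceph osd last-stat-seq 0
```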
'.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:09:07.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:09:07.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:09:07.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:09:07.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3645: corrupt_scrub_erasure: echo -n val3-EOBJ4 2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3646: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 2 EOBJ4 set-attr _key3-EOBJ4 td/osd-scrub-repair/newval 2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:09:07.926 
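The `get_num_active_clean` helper traced above counts PGs whose state contains both "active" and "clean" but not "stale", via the `jq` filter applied to `ceph pg dump pgs`. The same predicate expressed in plain bash, for readers without `jq` at hand (`count_active_clean` is my name; it takes one PG state string per argument):

```bash
#!/usr/bin/env bash
# Count states that are active AND clean but NOT stale — mirrors the jq
# select(contains("active") and contains("clean")) | select(contains("stale") | not)
count_active_clean() {
    local n=0 s
    for s in "$@"; do
        [[ $s == *active* && $s == *clean* && $s != *stale* ]] && n=$((n + 1))
    done
    echo "$n"
}

count_active_clean active+clean active+clean+scrubbing stale+active+clean peering
# → 2
```

`wait_for_clean` declares success when this count equals `get_num_pgs` — here, 6 of 6.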
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 EOBJ4 set-attr _key3-EOBJ4 td/osd-scrub-repair/newval
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:09:07.926 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:09:07.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:09:07.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:09:07.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:09:08.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:09:08.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 EOBJ4 set-attr _key3-EOBJ4 td/osd-scrub-repair/newval
2026-03-08T23:09:08.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:09:08.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:09:08.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2
2026-03-08T23:09:08.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:09:08.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2
2026-03-08T23:09:08.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 EOBJ4 set-attr _key3-EOBJ4 td/osd-scrub-repair/newval
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2'
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal'
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:09:09.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:09:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2
2026-03-08T23:09:09.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2
2026-03-08T23:09:09.212 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2
2026-03-08T23:09:09.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:09:09.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami
2026-03-08T23:09:09.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']'
2026-03-08T23:09:09.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:09:09.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:09:09.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:09:09.227 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:09.225+0000 7fe05b2e08c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:09.227 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:09.229+0000 7fe05b2e08c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:09.229 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:09.229+0000 7fe05b2e08c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:09:09.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:10.183 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:10.185+0000 7fe05b2e08c0 -1 Falling back to public interface
2026-03-08T23:09:10.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:10.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:10.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:09:10.537 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:09:10.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:10.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:09:10.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:11.164 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:11.165+0000 7fe05b2e08c0 -1 osd.2 64 log_to_monitors true
2026-03-08T23:09:11.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:11.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:11.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:09:11.709 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:09:11.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:11.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:09:11.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:12.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:12.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:12.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:09:12.884 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:09:12.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:12.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 68 up_thru 68 down_at 65 last_clean_interval [63,64) [v2:127.0.0.1:6818/2021926908,v1:127.0.0.1:6819/2021926908] [v2:127.0.0.1:6820/2021926908,v1:127.0.0.1:6821/2021926908] exists,up 13e2026c-a619-4ed4-ad39-7fc269eaec21
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:09:13.042 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:09:13.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:09:13.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:09:13.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:09:13.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:09:13.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:09:13.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:09:13.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:09:13.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:09:13.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:09:13.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:09:13.256 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:09:13.256 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:09:13.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:09:13.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:13.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:09:13.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=249108103175
2026-03-08T23:09:13.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 249108103175
2026-03-08T23:09:13.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103175'
2026-03-08T23:09:13.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:13.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:09:13.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332105
2026-03-08T23:09:13.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332105
2026-03-08T23:09:13.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103175 1-219043332105'
2026-03-08T23:09:13.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:13.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:09:13.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776130
2026-03-08T23:09:13.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776130
2026-03-08T23:09:13.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103175 1-219043332105 2-292057776130'
2026-03-08T23:09:13.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:13.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-249108103175
2026-03-08T23:09:13.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:13.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:09:13.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-249108103175
2026-03-08T23:09:13.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:13.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=249108103175
2026-03-08T23:09:13.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 249108103175'
2026-03-08T23:09:13.473 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 249108103175
2026-03-08T23:09:13.473 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:13.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103173 -lt 249108103175
2026-03-08T23:09:13.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:09:14.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:09:14.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:14.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103175 -lt 249108103175
2026-03-08T23:09:14.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:14.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332105
2026-03-08T23:09:14.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:14.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:09:14.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332105
2026-03-08T23:09:14.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332105
2026-03-08T23:09:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332105'
2026-03-08T23:09:14.796 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332105
2026-03-08T23:09:14.796 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:14.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332105 -lt 219043332105
2026-03-08T23:09:14.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:14.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776130
2026-03-08T23:09:14.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:14.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:09:14.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776130
2026-03-08T23:09:14.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:14.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776130
2026-03-08T23:09:14.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776130'
2026-03-08T23:09:14.958 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776130
2026-03-08T23:09:14.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:09:15.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776130 -lt 292057776130
2026-03-08T23:09:15.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:09:15.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:09:15.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:09:15.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:09:15.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:09:15.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:09:15.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:09:15.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:09:15.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:09:15.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:09:15.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:09:15.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:09:15.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:09:15.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:09:15.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:09:15.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:09:15.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:09:15.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:09:15.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3647: corrupt_scrub_erasure: rm td/osd-scrub-repair/bad-val td/osd-scrub-repair/newval
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs)
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ5
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ5
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ5
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:09:15.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:09:15.856 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:09:15.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:09:16.065 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:09:16.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:09:16.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:09:16.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ5 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:09:16.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 5 % 2
2026-03-08T23:09:16.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=1
2026-03-08T23:09:16.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in
2026-03-08T23:09:16.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3652: corrupt_scrub_erasure: dd if=/dev/urandom of=td/osd-scrub-repair/CORRUPT bs=2048 count=2
2026-03-08T23:09:16.100 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records in
2026-03-08T23:09:16.100 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records out
2026-03-08T23:09:16.100 INFO:tasks.workunit.client.0.vm03.stderr:4096 bytes (4.1 kB, 4.0 KiB) copied, 5.277e-05 s, 77.6 MB/s
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3653: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 1 EOBJ5 set-bytes td/osd-scrub-repair/CORRUPT
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 EOBJ5 set-bytes td/osd-scrub-repair/CORRUPT
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:09:16.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:09:16.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:09:16.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 EOBJ5 set-bytes
td/osd-scrub-repair/CORRUPT 2026-03-08T23:09:16.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:09:16.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:09:16.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:09:16.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:09:16.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:09:16.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 EOBJ5 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:09:17.398 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:09:17.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:09:17.398 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 
2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:09:17.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:09:17.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:09:17.400 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:09:17.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 
--auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:09:17.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:09:17.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:09:17.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:09:17.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:09:17.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:09:17.413 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:17.413+0000 7f6d156d48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:17.421 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:17.421+0000 7f6d156d48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:17.422 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:17.421+0000 7f6d156d48c0 -1 WARNING: all dangerous and experimental features are enabled. 
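Editor's note: in the ceph-osd invocation above, the arguments carrying `$cluster` and `$name` (`--admin-socket=.../$cluster-$name.asok`, `--log-file=td/osd-scrub-repair/$name.log`, `--pid-file=.../$name.pid`) are deliberately single-quoted so the shell passes them through literally and the daemon substitutes the metavariables itself. A minimal sketch of the quoting distinction (variable names here are illustrative, not taken from the helper):

```shell
# Single quotes keep ceph-style metavariables literal so the daemon,
# not the shell, performs the substitution; double quotes expand now.
name=osd.1
quoted='--log-file=td/$name.log'      # stays literal: td/$name.log
unquoted="--log-file=td/$name.log"    # shell expands: td/osd.1.log
printf '%s\n' "$quoted"
printf '%s\n' "$unquoted"
```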
2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:17.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:09:17.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:18.368 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:18.369+0000 7f6d156d48c0 -1 Falling back to public interface 2026-03-08T23:09:18.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:09:18.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:18.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:09:18.721 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:09:18.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:18.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:09:18.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:19.352 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:19.353+0000 7f6d156d48c0 -1 osd.1 71 log_to_monitors true 2026-03-08T23:09:19.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:19.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:19.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:09:19.883 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:09:19.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:19.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:09:20.056 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:21.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:21.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:21.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:09:21.057 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:09:21.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:21.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 75 up_thru 75 down_at 72 last_clean_interval [51,71) [v2:127.0.0.1:6810/3299292360,v1:127.0.0.1:6811/3299292360] [v2:127.0.0.1:6812/3299292360,v1:127.0.0.1:6813/3299292360] exists,up 3d9539b6-0f15-4899-81a5-1a6d617f533b 2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
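Editor's note: the wait_for_osd trace above is a bounded poll — up to 300 one-second probes of `ceph osd dump | grep 'osd.1 up'`, returning 0 on the first hit. A generic sketch of that pattern (the helper name and stub predicate below are illustrative, not from ceph-helpers.sh):

```shell
# Poll a predicate command up to $tries times, sleeping $interval between
# attempts; succeed as soon as the predicate does (wait_for_osd's shape).
poll_until() {
    local tries=$1 interval=$2
    shift 2
    local i
    for ((i = 0; i < tries; i++)); do
        "$@" && return 0
        sleep "$interval"
    done
    return 1
}

# Example with a stub predicate that only succeeds on its third call.
attempts=0
stub_up() { attempts=$((attempts + 1)); [ "$attempts" -ge 3 ]; }
poll_until 300 0 stub_up && echo "up after $attempts attempts"
# prints: up after 3 attempts
```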
2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:09:21.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:09:21.219 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:09:21.219 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:09:21.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:09:21.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:09:21.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:09:21.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:09:21.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:09:21.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:09:21.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:09:21.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:09:21.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:09:21.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:09:21.454 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:09:21.454 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:09:21.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:09:21.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:09:21.454 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:09:21.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=249108103177 2026-03-08T23:09:21.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 249108103177 2026-03-08T23:09:21.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-249108103177' 2026-03-08T23:09:21.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:09:21.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:09:21.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=322122547202 2026-03-08T23:09:21.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 322122547202 2026-03-08T23:09:21.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103177 1-322122547202' 2026-03-08T23:09:21.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:09:21.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:09:21.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776133 2026-03-08T23:09:21.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776133 2026-03-08T23:09:21.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103177 1-322122547202 2-292057776133' 2026-03-08T23:09:21.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
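Editor's note: flush_pg_stats accumulates one `osd-seq` token per OSD in $seqs (e.g. `0-249108103177`), packing the OSD id and its flush sequence number with a `-` separator; splitting them back apart is a pair of `cut` one-liners. A self-contained sketch using the sequence values from this run:

```shell
# Split "osd-seq" tokens back into their id and sequence fields.
seqs='0-249108103177 1-322122547202 2-292057776133'
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)
    seq=$(echo "$s" | cut -d - -f 2)
    echo "waiting osd.$osd seq $seq"
done
```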
2026-03-08T23:09:21.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-249108103177 2026-03-08T23:09:21.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:21.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:09:21.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-249108103177 2026-03-08T23:09:21.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:21.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=249108103177 2026-03-08T23:09:21.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 249108103177' 2026-03-08T23:09:21.681 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 249108103177 2026-03-08T23:09:21.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:09:21.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103176 -lt 249108103177 2026-03-08T23:09:21.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:09:22.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 
-eq 0 ']' 2026-03-08T23:09:22.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:09:22.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103177 -lt 249108103177 2026-03-08T23:09:22.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:22.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-322122547202 2026-03-08T23:09:22.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:22.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:09:22.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-322122547202 2026-03-08T23:09:22.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:22.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=322122547202 2026-03-08T23:09:22.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 322122547202' 2026-03-08T23:09:22.995 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 322122547202 2026-03-08T23:09:22.995 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:09:23.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 322122547202 -lt 322122547202 2026-03-08T23:09:23.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:23.149 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776133 2026-03-08T23:09:23.149 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:23.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:09:23.150 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776133 2026-03-08T23:09:23.150 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:23.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776133 2026-03-08T23:09:23.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776133' 2026-03-08T23:09:23.151 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776133 2026-03-08T23:09:23.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:09:23.306 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776133 -lt 292057776133 2026-03-08T23:09:23.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:09:23.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:09:23.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:09:23.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:09:23.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:09:23.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:09:23.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:09:23.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:09:23.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:09:23.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
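Editor's note: get_num_active_clean above builds a jq filter that keeps PG states containing both "active" and "clean" while rejecting any containing "stale", then counts the survivors. The same selection expressed in awk over one state string per line (an illustrative stand-in for the jq/JSON path; the input format is assumed):

```shell
# Count PG states containing both "active" and "clean" but not "stale",
# mirroring get_num_active_clean's jq selection.
count_active_clean() {
    awk '/active/ && /clean/ && !/stale/ { n++ } END { print n + 0 }'
}

printf '%s\n' \
    'active+clean' \
    'active+clean+scrubbing' \
    'stale+active+clean' \
    'active+recovering' \
| count_active_clean
# prints: 2
```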
2026-03-08T23:09:23.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:09:23.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:09:23.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:09:23.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:09:23.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ6 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: 
corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ6 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ6 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:09:23.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:09:24.052 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:09:24.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:09:24.260 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:09:24.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:09:24.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:09:24.273 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ6 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:09:24.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 6 % 2 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=0 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3657: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ6 rm-attr hinfo_key 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ6 rm-attr hinfo_key 2026-03-08T23:09:24.300 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:09:24.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:09:24.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:09:24.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:09:24.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:09:24.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:09:24.407 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ6 rm-attr hinfo_key 2026-03-08T23:09:24.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:09:24.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:09:24.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:09:24.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:09:24.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:09:24.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ6 rm-attr hinfo_key 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:09:25.586 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:09:25.586 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:09:25.586 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:09:25.587 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:09:25.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:09:25.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:09:25.588 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:09:25.588 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:09:25.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:09:25.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:09:25.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:09:25.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:09:25.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:09:25.603 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:25.601+0000 7fb12721c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:25.611 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:25.609+0000 7fb12721c8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:09:25.612 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:25.613+0000 7fb12721c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:25.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:09:25.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:26.816 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:26.817+0000 7fb12721c8c0 -1 Falling back to 
public interface 2026-03-08T23:09:26.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:26.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:26.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:09:26.919 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:09:26.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:26.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:09:27.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:27.800 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:27.801+0000 7fb12721c8c0 -1 osd.0 78 log_to_monitors true 2026-03-08T23:09:28.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:28.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:28.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:09:28.082 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:09:28.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:28.083 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:09:28.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:29.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:29.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:29.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:09:29.246 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:09:29.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:29.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:09:29.398 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 82 up_thru 82 down_at 79 last_clean_interval [58,78) [v2:127.0.0.1:6802/443027105,v1:127.0.0.1:6803/443027105] [v2:127.0.0.1:6804/443027105,v1:127.0.0.1:6805/443027105] exists,up c4e8c760-134f-405e-a9e9-2b0353c4d3de 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:09:29.399 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:09:29.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:09:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:09:29.458 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:09:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:09:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:09:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:09:29.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:09:29.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:09:29.623 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:09:29.623 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:09:29.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:09:29.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:09:29.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:09:29.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318274 2026-03-08T23:09:29.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318274 
2026-03-08T23:09:29.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318274' 2026-03-08T23:09:29.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:09:29.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:09:29.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=322122547205 2026-03-08T23:09:29.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 322122547205 2026-03-08T23:09:29.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318274 1-322122547205' 2026-03-08T23:09:29.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:09:29.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:09:29.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776135 2026-03-08T23:09:29.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776135 2026-03-08T23:09:29.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318274 1-322122547205 2-292057776135' 
2026-03-08T23:09:29.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:29.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-352187318274 2026-03-08T23:09:29.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:29.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:09:29.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-352187318274 2026-03-08T23:09:29.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:29.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318274 2026-03-08T23:09:29.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 352187318274' 2026-03-08T23:09:29.846 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 352187318274 2026-03-08T23:09:29.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:09:30.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 352187318274 2026-03-08T23:09:30.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T23:09:31.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:09:31.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:31.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318274 -lt 352187318274
2026-03-08T23:09:31.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:31.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-322122547205
2026-03-08T23:09:31.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:31.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:09:31.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-322122547205
2026-03-08T23:09:31.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:31.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=322122547205
2026-03-08T23:09:31.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 322122547205'
2026-03-08T23:09:31.161 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 322122547205
2026-03-08T23:09:31.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:31.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 322122547205 -lt 322122547205
2026-03-08T23:09:31.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:31.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776135
2026-03-08T23:09:31.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:31.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:09:31.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776135
2026-03-08T23:09:31.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:31.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776135
2026-03-08T23:09:31.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776135'
2026-03-08T23:09:31.320 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776135
2026-03-08T23:09:31.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:09:31.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776135 -lt 292057776135
2026-03-08T23:09:31.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:09:31.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:09:31.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:09:31.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:09:31.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:09:31.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:09:31.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:09:31.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:09:31.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:09:31.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:09:31.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:09:31.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:09:31.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:09:31.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:09:31.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3658: corrupt_scrub_erasure: echo -n bad-val
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3659: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 1 EOBJ6 set-attr hinfo_key td/osd-scrub-repair/bad-val
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 EOBJ6 set-attr hinfo_key td/osd-scrub-repair/bad-val
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:09:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:09:32.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:09:32.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 EOBJ6 set-attr hinfo_key td/osd-scrub-repair/bad-val
2026-03-08T23:09:32.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:09:32.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:09:32.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:09:32.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:09:32.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:09:32.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 EOBJ6 set-attr hinfo_key td/osd-scrub-repair/bad-val
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:09:33.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:09:33.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:09:33.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T23:09:33.324 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T23:09:33.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:09:33.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T23:09:33.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T23:09:33.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:09:33.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:09:33.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:09:33.339 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:33.337+0000 7f0d61ef48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:33.348 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:33.349+0000 7f0d61ef48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:33.350 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:33.349+0000 7f0d61ef48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:09:33.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:33.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:09:33.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:34.308 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:34.309+0000 7f0d61ef48c0 -1 Falling back to public interface
2026-03-08T23:09:34.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:34.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:34.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:09:34.663 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:09:34.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:34.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:09:34.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:35.286 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:35.285+0000 7f0d61ef48c0 -1 osd.1 83 log_to_monitors true
2026-03-08T23:09:35.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:35.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:35.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:09:35.831 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:09:35.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:35.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:09:36.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:37.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:37.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:37.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:09:37.021 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:09:37.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:37.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:09:37.174 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 87 up_thru 87 down_at 84 last_clean_interval [75,83) [v2:127.0.0.1:6810/556527254,v1:127.0.0.1:6811/556527254] [v2:127.0.0.1:6812/556527254,v1:127.0.0.1:6813/556527254] exists,up 3d9539b6-0f15-4899-81a5-1a6d617f533b
2026-03-08T23:09:37.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:09:37.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:09:37.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:09:37.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:09:37.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:09:37.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:09:37.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:09:37.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:09:37.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:09:37.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:09:37.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:09:37.405 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:09:37.405 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:09:37.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:09:37.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:37.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:09:37.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318276
2026-03-08T23:09:37.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318276
2026-03-08T23:09:37.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318276'
2026-03-08T23:09:37.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:37.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:09:37.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154754
2026-03-08T23:09:37.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154754
2026-03-08T23:09:37.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318276 1-373662154754'
2026-03-08T23:09:37.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:37.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:09:37.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776138
2026-03-08T23:09:37.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776138
2026-03-08T23:09:37.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318276 1-373662154754 2-292057776138'
2026-03-08T23:09:37.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:37.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-352187318276
2026-03-08T23:09:37.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:37.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:09:37.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-352187318276
2026-03-08T23:09:37.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:37.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318276
2026-03-08T23:09:37.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 352187318276'
2026-03-08T23:09:37.628 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 352187318276
2026-03-08T23:09:37.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:37.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318275 -lt 352187318276
2026-03-08T23:09:37.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:09:38.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:09:38.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318277 -lt 352187318276
2026-03-08T23:09:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:38.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154754
2026-03-08T23:09:38.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:38.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:09:38.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154754
2026-03-08T23:09:38.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:38.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154754
2026-03-08T23:09:38.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154754'
2026-03-08T23:09:38.951 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 373662154754
2026-03-08T23:09:38.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats:
ceph osd last-stat-seq 1 2026-03-08T23:09:39.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154754 -lt 373662154754 2026-03-08T23:09:39.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:09:39.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776138 2026-03-08T23:09:39.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:09:39.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:09:39.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776138 2026-03-08T23:09:39.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:09:39.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776138 2026-03-08T23:09:39.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776138' 2026-03-08T23:09:39.112 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776138 2026-03-08T23:09:39.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:09:39.268 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776138 -lt 292057776138 2026-03-08T23:09:39.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:09:39.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:09:39.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:09:39.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:09:39.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:09:39.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:09:39.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:09:39.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:09:39.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:09:39.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
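The xtrace above (ceph-helpers.sh:2260-2279) is flush_pg_stats at work: each OSD is told to flush its PG stats, the returned sequence number is recorded as `osd-seq` pairs, and the helper then polls `ceph osd last-stat-seq` per OSD until the monitor has caught up — note the one retry for osd.0, where `352187318275 -lt 352187318276` forced a `sleep 1`. A condensed sketch of that flush-then-poll pattern; `ceph` is stubbed here so the snippet runs without a cluster (the real helper shells out to the test cluster and splits the pairs with `cut`):

```shell
#!/usr/bin/env bash
# Stub: both 'ceph tell osd.N flush_pg_stats' and 'ceph osd last-stat-seq N'
# report the same sequence number, so the poll loop exits on its first check.
ceph() { echo 373662154754; }

flush_pg_stats_sketch() {
    local timeout=300 osd seq s seqs=""
    for osd in 0 1 2; do
        seq=$(ceph tell osd.$osd flush_pg_stats)   # flush; returns a seq number
        seqs="$seqs $osd-$seq"                     # remember it per OSD
    done
    for s in $seqs; do
        osd=${s%-*}; seq=${s#*-}                   # split the osd-seq pair
        echo "waiting osd.$osd seq $seq"
        # Poll until the mon's view of this OSD's stats reaches $seq,
        # sleeping 1s per retry, giving up after $timeout retries.
        while [ "$(ceph osd last-stat-seq $osd)" -lt "$seq" ]; do
            sleep 1
            timeout=$((timeout - 1)); [ $timeout -eq 0 ] && return 1
        done
    done
}
flush_pg_stats_sketch
```

The per-OSD sequence numbers make the wait exact: the helper does not guess at settling time, it waits for precisely the flush it requested to be visible at the monitor.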
2026-03-08T23:09:39.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:09:39.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:09:39.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:09:39.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:09:39.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ7 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: 
corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ7 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ7 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:09:39.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:09:40.023 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:09:40.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:09:40.233 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:09:40.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:09:40.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:09:40.247 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ7 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:09:40.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 7 % 2 2026-03-08T23:09:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=1 2026-03-08T23:09:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:09:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3663: corrupt_scrub_erasure: local payload=MAKETHISDIFFERENTFROMOTHEROBJECTS 2026-03-08T23:09:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3664: corrupt_scrub_erasure: echo MAKETHISDIFFERENTFROMOTHEROBJECTS 2026-03-08T23:09:40.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3665: corrupt_scrub_erasure: rados --pool ecpool put EOBJ7 td/osd-scrub-repair/DIFFERENT 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3668: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ1 get-attr hinfo_key 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 
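The add_something trace above (osd-scrub-repair.sh:71-87) shows the setup order that matters for these tests: the noscrub and nodeep-scrub flags go up *before* the object is written, so no scheduled scrub can repair the corruption that corrupt_scrub_erasure injects next — here it immediately overwrites EOBJ7 with a different payload so the shards diverge. A runnable sketch of that setup phase, with the cluster commands stubbed out:

```shell
#!/usr/bin/env bash
# Sketch of add_something's setup phase; 'ceph' and 'rados' are stubs so
# this runs without a cluster.
ceph()  { echo "$3 is set"; }   # stands in for: ceph osd set <flag>
rados() { :; }                  # stands in for: rados --pool <pool> put <obj> <file>

add_something_sketch() {
    local dir=$1 poolname=$2 obj=$3
    ceph osd set noscrub        # keep scheduled scrubs away while errors are injected
    ceph osd set nodeep-scrub
    local payload=ABCDEF
    echo $payload > "$dir/ORIGINAL"
    rados --pool "$poolname" put "$obj" "$dir/ORIGINAL"
}

dir=$(mktemp -d)
add_something_sketch "$dir" ecpool EOBJ7
```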
2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ1 get-attr hinfo_key 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:09:40.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:09:40.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:09:40.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:09:40.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:09:40.291 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:09:40.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:09:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:09:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ1 get-attr hinfo_key 2026-03-08T23:09:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:09:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:09:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:09:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:09:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:09:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ1 get-attr hinfo_key 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: 
_objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:09:41.007 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:09:41.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:09:41.008 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:09:41.008 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:09:41.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:09:41.009 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:09:41.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:09:41.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:09:41.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:09:41.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:09:41.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:09:41.015 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:09:41.027 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:41.025+0000 7fac715d58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:41.027 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:41.029+0000 7fac715d58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:41.029 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:41.029+0000 7fac715d58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:09:41.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:09:41.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:09:41.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:09:41.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:09:41.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:09:41.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:41.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:09:41.185 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:09:41.186 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:41.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:09:41.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:41.740 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:41.741+0000 7fac715d58c0 -1 Falling back to public interface 2026-03-08T23:09:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:09:42.349 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:09:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:09:42.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:09:42.758 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:42.757+0000 7fac715d58c0 -1 osd.0 90 log_to_monitors true 2026-03-08T23:09:43.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:09:43.520 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:09:43.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:09:43.520 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:09:43.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:09:43.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:09:43.705 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 94 up_thru 94 down_at 91 last_clean_interval [82,90) [v2:127.0.0.1:6802/642665159,v1:127.0.0.1:6803/642665159] [v2:127.0.0.1:6804/642665159,v1:127.0.0.1:6805/642665159] exists,up c4e8c760-134f-405e-a9e9-2b0353c4d3de 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local 
num_active_clean=-1 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:09:43.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:09:43.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:09:43.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:09:43.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:09:43.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:09:43.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:09:43.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 
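The `get_timeout_delays 90 .1` expansion above turns a total timeout (90s) and an initial delay (0.1s) into an exponential-backoff schedule: each delay doubles up to a cap, and the final entry is whatever remainder keeps the sum at exactly the timeout. A sketch that reproduces the schedule seen in the trace — the 15s cap is inferred from this output, not checked against the helper's source:

```shell
#!/usr/bin/env bash
# Sketch of get_timeout_delays: doubling backoff, capped, summing to total.
get_timeout_delays_sketch() {
    awk -v total=$1 -v d=$2 -v cap=15 'BEGIN {
        sum = 0
        while (sum + d < total) {       # emit doubling delays under the cap
            printf "%g ", d; sum += d
            d *= 2; if (d > cap) d = cap
        }
        printf "%g\n", total - sum      # remainder keeps the sum == total
    }'
}
get_timeout_delays_sketch 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

This is why wait_for_clean reacts quickly when the cluster settles fast but backs off to 15-second checks on a slow cluster, without ever exceeding its 90-second budget.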
2026-03-08T23:09:43.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:09:43.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:09:43.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:09:43.943 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:09:43.943 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:09:43.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:09:43.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:43.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:09:44.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=403726925826
2026-03-08T23:09:44.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 403726925826
2026-03-08T23:09:44.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-403726925826'
2026-03-08T23:09:44.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:44.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:09:44.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154756
2026-03-08T23:09:44.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154756
2026-03-08T23:09:44.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-403726925826 1-373662154756'
2026-03-08T23:09:44.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:44.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:09:44.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776140
2026-03-08T23:09:44.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776140
2026-03-08T23:09:44.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-403726925826 1-373662154756 2-292057776140'
2026-03-08T23:09:44.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:44.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-403726925826
2026-03-08T23:09:44.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
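The trace above shows `flush_pg_stats` from `ceph-helpers.sh` telling each OSD to flush its PG stats, remembering the sequence number each flush returned as an `osd-seq` pair, and later splitting those pairs back apart with `cut`. A minimal standalone sketch of that bookkeeping, with `tell_flush` and `last_stat_seq` as hypothetical stand-ins for the real `ceph tell osd.N flush_pg_stats` and `ceph osd last-stat-seq N` calls (so it runs without a cluster):

```shell
# Stubs for the cluster commands; the real helper calls ceph directly.
tell_flush()    { echo $(( 1000 + $1 )); }   # pretend the flush returned a seq
last_stat_seq() { echo $(( 1000 + $1 )); }   # pretend the mon already caught up

flush_pg_stats_sketch() {
    local ids="0 1 2" seqs= osd seq s
    for osd in $ids; do
        seq=$(tell_flush "$osd")
        test -z "$seq" && continue
        seqs="$seqs $osd-$seq"              # remember which seq to wait for
    done
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)    # same "cut -d -" parsing as the trace
        seq=$(echo "$s" | cut -d - -f 2)
        while test "$(last_stat_seq "$osd")" -lt "$seq"; do
            sleep 1                          # real helper also decrements a timeout
        done
    done
    echo $seqs                               # unquoted: collapses the leading space
}
```

With the stubs above, `flush_pg_stats_sketch` prints `0-1000 1-1001 2-1002`; in the real helper the wait loop is what makes `ceph pg dump` reflect the flushed stats before the test proceeds.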
2026-03-08T23:09:44.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:09:44.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-403726925826
2026-03-08T23:09:44.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:44.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=403726925826
2026-03-08T23:09:44.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 403726925826'
2026-03-08T23:09:44.181 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 403726925826
2026-03-08T23:09:44.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:44.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 403726925826 -lt 403726925826
2026-03-08T23:09:44.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:44.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154756
2026-03-08T23:09:44.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:44.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:09:44.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154756
2026-03-08T23:09:44.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:44.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154756
2026-03-08T23:09:44.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154756'
2026-03-08T23:09:44.349 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 373662154756
2026-03-08T23:09:44.349 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:44.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154755 -lt 373662154756
2026-03-08T23:09:44.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:09:45.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:09:45.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:45.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154755 -lt 373662154756
2026-03-08T23:09:45.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:09:46.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:09:46.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:46.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154757 -lt 373662154756
2026-03-08T23:09:46.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:46.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776140
2026-03-08T23:09:46.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:46.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:09:46.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776140
2026-03-08T23:09:46.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:46.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776140
2026-03-08T23:09:46.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776140'
2026-03-08T23:09:46.846 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776140
2026-03-08T23:09:46.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:09:47.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776140 -lt 292057776140
2026-03-08T23:09:47.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:09:47.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:09:47.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:09:47.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:09:47.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:09:47.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:09:47.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:09:47.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:09:47.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:09:47.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:09:47.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:09:47.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:09:47.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:09:47.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:09:47.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3669: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ7 set-attr hinfo_key td/osd-scrub-repair/hinfo
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ7 set-attr hinfo_key td/osd-scrub-repair/hinfo
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:09:47.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:09:47.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:09:47.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ7 set-attr hinfo_key td/osd-scrub-repair/hinfo
2026-03-08T23:09:47.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:09:47.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:09:47.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:09:47.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:09:47.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:09:47.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ7 set-attr hinfo_key td/osd-scrub-repair/hinfo
2026-03-08T23:09:48.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0
2026-03-08T23:09:48.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:09:48.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:09:48.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:09:48.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
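The trace from `corrupt_scrub_erasure` onward exercises the `_objectstore_tool_nowait` pattern from `ceph-helpers.sh`: stop the target OSD with `kill_daemons ... TERM osd.N`, mutate its offline store with `ceph-objectstore-tool --data-path ...` (here, `set-attr hinfo_key` to corrupt an erasure-coded shard), then restart it via `activate_osd`. A sketch of that stop-modify-restart control flow, with the three cluster operations replaced by call-recording stubs so it runs standalone (the stubs and the `calls` variable are illustrative, not part of the real helper):

```shell
# Stubs that record the order of operations instead of touching a cluster.
calls=
kill_daemons()          { calls="$calls stop($3)"; }    # real: sends TERM to the daemon
ceph_objectstore_tool() { calls="$calls tool($*)"; }    # real: ceph-objectstore-tool
activate_osd()          { calls="$calls start($2)"; }   # real: relaunches ceph-osd

objectstore_tool_sketch() {
    local dir=$1; shift
    local id=$1;  shift
    kill_daemons "$dir" TERM "osd.$id" || return 1       # OSD must be down first
    # operate on the offline store, e.g. "EOBJ7 set-attr hinfo_key <file>"
    ceph_objectstore_tool --data-path "$dir/$id" "$@" || return 1
    activate_osd "$dir" "$id"                            # bring the OSD back up
}
```

The ordering matters: the object store cannot be modified while the OSD process holds it, which is why the trace shows `kill_daemons` strictly before `ceph-objectstore-tool` and `activate_osd` strictly after.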
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:09:48.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:09:48.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:09:48.844 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T23:09:48.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:09:48.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:09:48.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:09:48.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:09:48.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:09:48.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:09:48.861 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:48.861+0000 7fd45b1458c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:48.861 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:48.861+0000 7fd45b1458c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:48.862 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:48.861+0000 7fd45b1458c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:09:49.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:49.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:09:49.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:49.832 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:49.833+0000 7fd45b1458c0 -1 Falling back to public interface
2026-03-08T23:09:50.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:50.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:50.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:09:50.194 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:09:50.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:50.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:09:50.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:50.804 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:09:50.805+0000 7fd45b1458c0 -1 osd.0 95 log_to_monitors true
2026-03-08T23:09:51.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:51.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:51.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:09:51.362 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:09:51.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:51.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:09:51.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:09:52.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:09:52.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:09:52.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:09:52.528 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:09:52.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:09:52.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 99 up_thru 99 down_at 96 last_clean_interval [94,95) [v2:127.0.0.1:6802/2496719013,v1:127.0.0.1:6803/2496719013] [v2:127.0.0.1:6804/2496719013,v1:127.0.0.1:6805/2496719013] exists,up c4e8c760-134f-405e-a9e9-2b0353c4d3de
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
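The `wait_for_osd` trace above is a bounded polling loop: up to 300 one-second attempts, each of which pipes `ceph osd dump` through `grep 'osd.0 up'`, breaking with `status=0` on the first match (here on attempt 3, once the restarted OSD is marked up). A generic sketch of that loop shape, with the predicate passed in so it runs without a cluster (`check`/`pred` are hypothetical stand-ins for the real `ceph osd dump | grep` pipeline):

```shell
# Bounded polling in the style of wait_for_osd: retry a predicate up to a
# fixed number of attempts, returning 0 as soon as it succeeds.
wait_for_sketch() {
    local tries=$1; shift
    local status=1 i
    for (( i = 0; i < tries; i++ )); do
        if "$@"; then           # run the predicate (real helper greps osd dump)
            status=0
            break
        fi
        sleep 0                 # the real helper sleeps 1 between attempts
    done
    return $status
}
```

Returning the saved `status` (rather than the last predicate result) is what lets the caller distinguish "condition eventually held" from "all 300 attempts exhausted".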
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:09:52.696 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:09:52.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:09:52.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:09:52.697 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:09:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:09:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
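The `delays` array that `get_timeout_delays 90 .1` produced above follows a recognizable exponential-backoff shape: start at 0.1s, double each step, cap individual steps at 15s, and make the final entry (4.5s) top the sum up to exactly the 90s timeout. A sketch that reconstructs the same schedule, written here in tenths of a second so it stays within integer shell arithmetic (the real helper works in fractional seconds; the 15s cap is inferred from the observed output, so treat it as an assumption):

```shell
# Backoff schedule matching the observed get_timeout_delays output.
# Args: $1 = timeout in seconds, $2 = first delay in tenths of a second.
timeout_delays_sketch() {
    local timeout=$(( $1 * 10 ))     # total budget, in tenths
    local d=$2 sum=0 out=
    while (( sum + d < timeout )); do
        out="$out $d"
        sum=$(( sum + d ))
        d=$(( d * 2 ))               # exponential growth...
        if (( d > 150 )); then       # ...capped at 15 seconds per step
            d=150
        fi
    done
    out="$out $(( timeout - sum ))"  # final delay lands exactly on the timeout
    echo $out                        # unquoted: collapses the leading space
}
```

`timeout_delays_sketch 90 1` yields `1 2 4 8 16 32 64 128 150 150 150 150 45`, i.e. the logged `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5` scaled by ten; the schedule sums to the full 90s budget while polling frequently at first and sparsely later.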
2026-03-08T23:09:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:09:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:09:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:09:52.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:09:52.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:09:52.919 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:09:52.920 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:09:52.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:09:52.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:52.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:09:52.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=425201762306
2026-03-08T23:09:52.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 425201762306
2026-03-08T23:09:52.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762306'
2026-03-08T23:09:52.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:52.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:09:53.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154759
2026-03-08T23:09:53.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154759
2026-03-08T23:09:53.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762306 1-373662154759'
2026-03-08T23:09:53.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:53.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:09:53.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776143
2026-03-08T23:09:53.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776143
2026-03-08T23:09:53.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762306 1-373662154759 2-292057776143'
2026-03-08T23:09:53.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:53.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-425201762306
2026-03-08T23:09:53.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:53.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:09:53.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-425201762306
2026-03-08T23:09:53.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:53.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=425201762306
2026-03-08T23:09:53.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 425201762306'
2026-03-08T23:09:53.162 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 425201762306
2026-03-08T23:09:53.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:53.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 425201762306
2026-03-08T23:09:53.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:09:54.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:09:54.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:54.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762306 -lt 425201762306
2026-03-08T23:09:54.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:54.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154759
2026-03-08T23:09:54.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:54.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:09:54.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154759
2026-03-08T23:09:54.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:54.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154759
2026-03-08T23:09:54.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154759'
2026-03-08T23:09:54.493 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 373662154759
2026-03-08T23:09:54.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:54.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154759 -lt 373662154759
2026-03-08T23:09:54.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:54.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776143
2026-03-08T23:09:54.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:54.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:09:54.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776143
2026-03-08T23:09:54.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:54.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776143
2026-03-08T23:09:54.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776143'
2026-03-08T23:09:54.663 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776143
2026-03-08T23:09:54.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:09:54.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776143 -lt 292057776143
2026-03-08T23:09:54.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:09:54.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:09:54.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:09:55.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:09:55.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:09:55.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:09:55.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:09:55.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:09:55.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:09:55.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
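[editor's note] get_num_active_clean builds a jq filter that keeps PG states containing both "active" and "clean" while excluding any "stale" state, then counts the matches. The same filter can be exercised against canned `ceph --format json pg dump pgs`-shaped output (assumes jq(1) is available, as the helpers themselves require; the wrapper name `count_active_clean` is illustrative):

```shell
#!/usr/bin/env bash
# Re-run of the jq filter assembled by get_num_active_clean in the trace,
# reading `pg dump pgs`-shaped JSON on stdin and printing the count.
count_active_clean() {
    jq '.pg_stats | [.[] | .state
        | select(contains("active") and contains("clean"))
        | select(contains("stale") | not)] | length'
}
```

Fed three PGs in states active+clean, stale+active+clean, and active+recovering, the filter keeps only the first, so it prints 1.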
2026-03-08T23:09:55.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:09:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:09:55.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:09:55.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:09:55.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:09:55.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:09:55.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:09:55.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:09:55.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3670: corrupt_scrub_erasure: rm -f td/osd-scrub-repair/hinfo
2026-03-08T23:09:55.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3676: corrupt_scrub_erasure: get_pg ecpool EOBJ0
2026-03-08T23:09:55.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool
2026-03-08T23:09:55.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=EOBJ0
2026-03-08T23:09:55.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool EOBJ0
2026-03-08T23:09:55.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T23:09:55.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3676: corrupt_scrub_erasure: local pg=3.0
2026-03-08T23:09:55.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3678: corrupt_scrub_erasure: pg_scrub 3.0
2026-03-08T23:09:55.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=3.0
2026-03-08T23:09:55.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 3.0
2026-03-08T23:09:55.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=3.0
2026-03-08T23:09:55.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3
2026-03-08T23:09:55.571 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:09:55.571 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:09:55.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:09:55.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:09:55.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:09:55.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3')
2026-03-08T23:09:55.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays
2026-03-08T23:09:55.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0
2026-03-08T23:09:55.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats
2026-03-08T23:09:55.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:09:55.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:09:55.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:09:55.902 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:09:55.902 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:09:55.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:09:55.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:55.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:09:55.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=425201762308
2026-03-08T23:09:55.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 425201762308
2026-03-08T23:09:55.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762308'
2026-03-08T23:09:55.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:55.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:09:56.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154761
2026-03-08T23:09:56.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154761
2026-03-08T23:09:56.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762308 1-373662154761'
2026-03-08T23:09:56.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:09:56.054 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:09:56.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776144
2026-03-08T23:09:56.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776144
2026-03-08T23:09:56.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762308 1-373662154761 2-292057776144'
2026-03-08T23:09:56.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:56.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-425201762308
2026-03-08T23:09:56.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:56.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:09:56.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-425201762308
2026-03-08T23:09:56.129 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:56.130 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 425201762308
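[editor's note] flush_pg_stats records each OSD's flush sequence number in a space-separated list of "osd-seq" words (`seqs=' 0-425201762308 1-373662154761 …'`) and later splits each word back apart with cut(1), exactly as traced above. A self-contained sketch of that bookkeeping (the function name `parse_seq_pairs` is illustrative):

```shell
#!/usr/bin/env bash
# Split "osd-seq" pairs the way flush_pg_stats does (cut on '-'),
# printing the same "waiting osd.N seq S" progress lines seen in the log.
parse_seq_pairs() {
    local s osd seq
    for s in "$@"; do
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
    done
}
```

In the real helper, each pair then gates a polling loop on `ceph osd last-stat-seq $osd` until the reported sequence catches up with `$seq`.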
2026-03-08T23:09:56.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=425201762308
2026-03-08T23:09:56.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 425201762308'
2026-03-08T23:09:56.130 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:09:56.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762308 -lt 425201762308
2026-03-08T23:09:56.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:56.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154761
2026-03-08T23:09:56.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:56.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:09:56.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154761
2026-03-08T23:09:56.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:56.298 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 373662154761
2026-03-08T23:09:56.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154761
2026-03-08T23:09:56.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154761'
2026-03-08T23:09:56.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:56.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154760 -lt 373662154761
2026-03-08T23:09:56.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:09:57.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:09:57.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:57.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154760 -lt 373662154761
2026-03-08T23:09:57.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:09:58.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:09:58.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:09:58.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154761 -lt 373662154761
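[editor's note] The wait_for_scrub loop traced further below polls get_last_scrub_stamp and compares stamps with test(1)'s lexicographic '>' operator; this works because ISO-8601 timestamps sort the same way as strings as they do chronologically. A minimal sketch of that comparison (the wrapper name `scrub_stamp_advanced` is illustrative):

```shell
#!/usr/bin/env bash
# Lexicographic comparison of scrub stamps, as wait_for_scrub does:
# succeeds once the current stamp is strictly newer than the stamp
# recorded before `ceph pg scrub` was issued.
scrub_stamp_advanced() {
    local current=$1 last=$2
    test "$current" '>' "$last"
}
```

While the scrub has not finished, the stamp is unchanged, the comparison fails, and the helper sleeps and retries, which is exactly the repeating `test … '>' …` / `sleep 1` pattern in the log.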
2026-03-08T23:09:58.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:09:58.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776144
2026-03-08T23:09:58.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:09:58.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:09:58.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776144
2026-03-08T23:09:58.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:09:58.795 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 292057776144
2026-03-08T23:09:58.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776144
2026-03-08T23:09:58.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776144'
2026-03-08T23:09:58.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 3.0 loop 0
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776145 -lt 292057776144
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 3.0 loop 0'
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 3.0
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=3.0
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 3.0 query
2026-03-08T23:09:58.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state '
2026-03-08T23:09:59.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean
2026-03-08T23:09:59.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]]
2026-03-08T23:09:59.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break
2026-03-08T23:09:59.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0
2026-03-08T23:09:59.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 3.0
2026-03-08T23:09:59.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:09:59.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:09:59.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:09:59.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:09:59.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local last_scrub=2026-03-08T23:08:21.526533+0000
2026-03-08T23:09:59.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 3.0
2026-03-08T23:09:59.329 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0s0 on osd.1 to scrub
2026-03-08T23:09:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 3.0 2026-03-08T23:08:21.526533+0000
2026-03-08T23:09:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0
2026-03-08T23:09:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:08:21.526533+0000
2026-03-08T23:09:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:09:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:09:59.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:09:59.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:09:59.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:09:59.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:09:59.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:09:59.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:09:59.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000
2026-03-08T23:09:59.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:10:00.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:10:00.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:10:00.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:10:00.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:10:00.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:10:00.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:10:00.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:10:00.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000
2026-03-08T23:10:00.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:10:01.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:10:01.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:10:01.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:10:01.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:10:01.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:10:01.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:10:01.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:10:01.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000
2026-03-08T23:10:01.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:10:02.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:10:02.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:10:02.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:10:02.815
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:02.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:10:02.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:02.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:10:02.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:09:59.412143+0000 '>' 2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:02.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:10:02.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3680: corrupt_scrub_erasure: rados list-inconsistent-pg ecpool 2026-03-08T23:10:02.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3682: corrupt_scrub_erasure: jq '. 
| length' td/osd-scrub-repair/json 2026-03-08T23:10:02.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3682: corrupt_scrub_erasure: test 1 = 1 2026-03-08T23:10:02.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3684: corrupt_scrub_erasure: jq -r '.[0]' td/osd-scrub-repair/json 2026-03-08T23:10:03.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3684: corrupt_scrub_erasure: test 3.0 = 3.0 2026-03-08T23:10:03.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3686: corrupt_scrub_erasure: rados list-inconsistent-obj 3.0 2026-03-08T23:10:03.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3688: corrupt_scrub_erasure: jq .epoch td/osd-scrub-repair/json 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3688: corrupt_scrub_erasure: epoch=99 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3690: corrupt_scrub_erasure: jq 'def walk(f): 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . 
+ { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' 2026-03-08T23:10:03.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3690: corrupt_scrub_erasure: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:10:03.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3690: corrupt_scrub_erasure: jq .inconsistents 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4310: corrupt_scrub_erasure: jq 'def walk(f): 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . 
+ { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' td/osd-scrub-repair/json 2026-03-08T23:10:03.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4310: corrupt_scrub_erasure: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:10:03.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4310: corrupt_scrub_erasure: jq .inconsistents 2026-03-08T23:10:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4311: corrupt_scrub_erasure: multidiff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:10:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:10:03.068 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4312: corrupt_scrub_erasure: test no = yes 2026-03-08T23:10:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4317: corrupt_scrub_erasure: test '' = yes 2026-03-08T23:10:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4322: corrupt_scrub_erasure: pg_deep_scrub 3.0 2026-03-08T23:10:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1941: pg_deep_scrub: local pgid=3.0 2026-03-08T23:10:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1943: pg_deep_scrub: wait_for_pg_clean 3.0 2026-03-08T23:10:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=3.0 2026-03-08T23:10:03.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:10:03.069 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:10:03.069 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:10:03.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:10:03.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:10:03.069 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:10:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:10:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:10:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:10:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:10:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:10:03.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:10:03.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:10:03.362 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:10:03.362 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:10:03.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:10:03.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:03.362 
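The `get_timeout_delays 90 1 3` call traced above expands to the array `('1' '2' '3' '3' … '3')` — 31 entries summing to the 90-second timeout. A minimal sketch of how such a schedule can be produced, assuming the delay doubles from the base until it hits the cap and entries accumulate until the timeout is covered (`make_delays` is a hypothetical name; the real helper is `get_timeout_delays` in qa/standalone/ceph-helpers.sh):

```shell
# Generate a backoff schedule: start at $base, double up to $cap,
# stop once the cumulative delay reaches $timeout.
# Hypothetical reimplementation for illustration only.
make_delays() {
    timeout=$1; base=$2; cap=$3
    sum=0; d=$base; out=""
    while [ "$sum" -lt "$timeout" ]; do
        out="$out$d "
        sum=$((sum + d))
        d=$((d * 2))
        if [ "$d" -gt "$cap" ]; then d=$cap; fi
    done
    echo "$out"
}

make_delays 90 1 3   # 1 2 3 3 ... 3  (31 entries, total 90s)
```

Under these assumptions the output matches the delays array in the trace: 1 + 2 + 29×3 = 90.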
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:10:03.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=425201762310 2026-03-08T23:10:03.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 425201762310 2026-03-08T23:10:03.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762310' 2026-03-08T23:10:03.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:03.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:10:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154763 2026-03-08T23:10:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154763 2026-03-08T23:10:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762310 1-373662154763' 2026-03-08T23:10:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:03.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:10:03.572 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776147 2026-03-08T23:10:03.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776147 2026-03-08T23:10:03.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762310 1-373662154763 2-292057776147' 2026-03-08T23:10:03.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:03.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-425201762310 2026-03-08T23:10:03.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:03.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:10:03.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-425201762310 2026-03-08T23:10:03.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:03.574 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 425201762310 2026-03-08T23:10:03.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=425201762310 2026-03-08T23:10:03.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 
425201762310' 2026-03-08T23:10:03.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:03.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762309 -lt 425201762310 2026-03-08T23:10:03.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:10:04.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:10:04.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:04.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762310 -lt 425201762310 2026-03-08T23:10:04.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:04.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154763 2026-03-08T23:10:04.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:04.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:10:04.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154763 2026-03-08T23:10:04.885 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:04.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154763 2026-03-08T23:10:04.886 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 373662154763 2026-03-08T23:10:04.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154763' 2026-03-08T23:10:04.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:10:05.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154763 -lt 373662154763 2026-03-08T23:10:05.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:05.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776147 2026-03-08T23:10:05.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:05.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:10:05.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776147 2026-03-08T23:10:05.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:10:05.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776147 2026-03-08T23:10:05.043 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 292057776147 2026-03-08T23:10:05.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776147' 2026-03-08T23:10:05.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:10:05.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776147 -lt 292057776147 2026-03-08T23:10:05.196 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 3.0 loop 0 2026-03-08T23:10:05.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:10:05.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 3.0 loop 0' 2026-03-08T23:10:05.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 3.0 2026-03-08T23:10:05.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=3.0 2026-03-08T23:10:05.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:10:05.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 3.0 query 2026-03-08T23:10:05.197 
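The `flush_pg_stats` loop above stores each OSD's stat sequence as an `osd-seq` pair (e.g. `0-425201762310`) and later splits the pair back apart with `cut`, exactly as the trace shows:

```shell
# Split a "osd-seq" pair the way flush_pg_stats does in the trace above.
s="0-425201762310"
osd=$(echo "$s" | cut -d - -f 1)
stat_seq=$(echo "$s" | cut -d - -f 2)
echo "waiting osd.$osd seq $stat_seq"   # waiting osd.0 seq 425201762310
```

The helper then polls `ceph osd last-stat-seq $osd` until the reported value is no longer below `$stat_seq`, which is the `test 425201762309 -lt 425201762310 … sleep 1` cycle visible in the log.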
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean+inconsistent 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean+inconsistent == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:05.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:10:05.422 
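Note how `is_pg_clean` accepts `active+clean+inconsistent`: it only checks that the state string *starts with* `active+clean`, so extra flags like `+inconsistent` do not block the wait. A sketch of that prefix test (`is_pg_clean_state` is a hypothetical name; the real helper fetches the state via `ceph pg $pgid query | jq -r '.state'`):

```shell
# Prefix match on the PG state string, as in the [[ ... == active+clean* ]]
# test traced above. Written as a portable case statement here.
is_pg_clean_state() {
    case $1 in
        active+clean*) return 0 ;;
        *)             return 1 ;;
    esac
}

is_pg_clean_state "active+clean+inconsistent" && echo "clean"
```

This is why `wait_for_pg_clean` breaks out of its loop immediately in the trace even though the PG is flagged inconsistent.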
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: local last_scrub=2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:05.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: pg_deep_scrub: ceph pg deep-scrub 3.0 2026-03-08T23:10:05.562 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0s0 on osd.1 to deep-scrub 2026-03-08T23:10:05.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: pg_deep_scrub: wait_for_scrub 3.0 2026-03-08T23:08:21.526533+0000 last_deep_scrub_stamp 2026-03-08T23:10:05.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0 2026-03-08T23:10:05.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:05.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_deep_scrub_stamp 2026-03-08T23:10:05.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:10:05.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:10:05.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:10:05.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:05.574 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:10:05.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:05.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:10:05.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:05.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:10:06.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:10:06.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:10:06.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:10:06.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:06.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:10:06.727 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:06.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:10:06.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:06.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:10:07.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:10:07.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:10:07.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:10:07.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:07.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:10:07.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:07.893 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:10:08.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:08.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:10:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:10:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:10:09.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:10:09.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:09.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:10:09.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:09.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:10:09.205 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:09.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:10:10.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:10:10.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:10:10.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:10:10.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:10.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:10:10.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:10.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:10:10.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:10.355 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:10:11.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:10:11.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:10:11.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:10:11.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:11.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:10:11.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:11.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:10:11.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:08:21.526533+0000 '>' 2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:11.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:10:12.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: 
wait_for_scrub: (( i++ )) 2026-03-08T23:10:12.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:10:12.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:10:12.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:10:12.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:10:12.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:10:12.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:10:12.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:06.399690+0000 '>' 2026-03-08T23:08:21.526533+0000 2026-03-08T23:10:12.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:10:12.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4324: corrupt_scrub_erasure: rados list-inconsistent-pg ecpool 2026-03-08T23:10:12.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4326: corrupt_scrub_erasure: jq '. 
| length' td/osd-scrub-repair/json 2026-03-08T23:10:12.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4326: corrupt_scrub_erasure: test 1 = 1 2026-03-08T23:10:12.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4328: corrupt_scrub_erasure: jq -r '.[0]' td/osd-scrub-repair/json 2026-03-08T23:10:12.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4328: corrupt_scrub_erasure: test 3.0 = 3.0 2026-03-08T23:10:12.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4330: corrupt_scrub_erasure: rados list-inconsistent-obj 3.0 2026-03-08T23:10:12.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4332: corrupt_scrub_erasure: jq .epoch td/osd-scrub-repair/json 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4332: corrupt_scrub_erasure: epoch=99 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4334: corrupt_scrub_erasure: '[' false = true ']' 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4991: corrupt_scrub_erasure: jq 'def walk(f): 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . 
+ { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' 2026-03-08T23:10:12.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4991: corrupt_scrub_erasure: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:10:12.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4991: corrupt_scrub_erasure: jq .inconsistents 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5724: corrupt_scrub_erasure: jq 'def walk(f): 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . 
+ { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:10:12.744 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:10:12.745 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:10:12.745 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:10:12.745 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' td/osd-scrub-repair/json 2026-03-08T23:10:12.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5724: corrupt_scrub_erasure: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:10:12.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5724: corrupt_scrub_erasure: jq .inconsistents 2026-03-08T23:10:12.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5725: corrupt_scrub_erasure: multidiff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:10:12.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:10:12.767 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5726: corrupt_scrub_erasure: test no = yes 2026-03-08T23:10:12.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5737: corrupt_scrub_erasure: test '' = yes 2026-03-08T23:10:12.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5742: corrupt_scrub_erasure: ceph osd pool rm ecpool ecpool --yes-i-really-really-mean-it 2026-03-08T23:10:12.965 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' removed 2026-03-08T23:10:12.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:10:12.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:10:12.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:10:12.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:10:12.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:10:12.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:10:12.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:10:12.980 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:10:12.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:10:13.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:10:13.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:10:13.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:10:13.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:10:13.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:10:13.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:10:13.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:10:13.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:10:13.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:10:13.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q 
'^core\|core$' 2026-03-08T23:10:13.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:10:13.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:10:13.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:10:13.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:10:13.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:10:13.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:13.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:13.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:10:13.122 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:10:13.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:10:13.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:10:13.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:10:13.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:10:13.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: 
kill_daemons: return 0 2026-03-08T23:10:13.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:10:13.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:10:13.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:10:13.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:10:13.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:10:13.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:10:13.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:10:13.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:10:13.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:10:13.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:10:13.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:10:13.129 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:10:13.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:10:13.130 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:10:13.130 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:13.130 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:13.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:10:13.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:10:13.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:10:13.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:10:13.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:10:13.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:13.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 
2026-03-08T23:10:13.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_scrub_erasure_overwrites td/osd-scrub-repair 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5750: TEST_corrupt_scrub_erasure_overwrites: '[' true = true ']' 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5751: TEST_corrupt_scrub_erasure_overwrites: corrupt_scrub_erasure td/osd-scrub-repair true 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3596: corrupt_scrub_erasure: local dir=td/osd-scrub-repair 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3597: corrupt_scrub_erasure: local allow_overwrites=true 2026-03-08T23:10:13.133 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3598: corrupt_scrub_erasure: local poolname=ecpool 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3599: corrupt_scrub_erasure: local total_objs=7 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3601: corrupt_scrub_erasure: run_mon td/osd-scrub-repair a 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:10:13.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T23:10:13.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:10:13.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:10:13.157 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:10:13.157 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:10:13.157 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:13.157 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:13.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:10:13.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:10:13.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:10:13.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 
2026-03-08T23:10:13.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:13.183 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:10:13.184 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:10:13.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:10:13.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:10:13.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:10:13.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:10:13.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:10:13.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:10:13.250 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:10:13.250 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:10:13.250 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:10:13.251 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:10:13.251 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:13.251 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:13.251 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:10:13.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:10:13.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:10:13.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3602: corrupt_scrub_erasure: run_mgr td/osd-scrub-repair x 2026-03-08T23:10:13.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:10:13.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:10:13.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:10:13.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:10:13.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:10:13.313 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:10:13.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:10:13.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:10:13.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:10:13.412 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:10:13.412 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:13.412 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:13.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:10:13.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:10:13.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 
--run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:10:13.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3603: corrupt_scrub_erasure: seq 0 2 2026-03-08T23:10:13.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3603: corrupt_scrub_erasure: for id in $(seq 0 2) 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3604: corrupt_scrub_erasure: run_osd td/osd-scrub-repair 0 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' 
--osd-failsafe-full-ratio=.99' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 
2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:10:13.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:10:13.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:10:13.445 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:10:13.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:10:13.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:10:13.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:10:13.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:10:13.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:10:13.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:10:13.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:10:13.446 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:10:13.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 e680aa22-886e-4c4b-92d7-00ecd80dc50c' 2026-03-08T23:10:13.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:10:13.464 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDVAa5pC16MGxAAxLdw9SJEcmJU8VFBRQo+sQ== 2026-03-08T23:10:13.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDVAa5pC16MGxAAxLdw9SJEcmJU8VFBRQo+sQ=="}' 2026-03-08T23:10:13.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e680aa22-886e-4c4b-92d7-00ecd80dc50c -i td/osd-scrub-repair/0/new.json 2026-03-08T23:10:13.554 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:10:13.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:10:13.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDVAa5pC16MGxAAxLdw9SJEcmJU8VFBRQo+sQ== --osd-uuid e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:10:13.584 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:13.584+0000 7f16ab8928c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:10:13.587 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:13.588+0000 7f16ab8928c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:10:13.589 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:13.588+0000 7f16ab8928c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:10:13.589 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:13.588+0000 7f16ab8928c0 -1 bdev(0x55d765cf8c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:10:13.589 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:13.588+0000 7f16ab8928c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:10:15.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:10:15.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:10:15.860 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:10:15.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:10:15.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:10:16.051 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:10:16.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:10:16.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 
0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:10:16.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:10:16.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:10:16.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:10:16.067 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:16.064+0000 7f51520ce8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:10:16.067 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:16.068+0000 7f51520ce8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:10:16.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:16.068+0000 7f51520ce8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:10:16.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:17.362 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:10:17.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:10:17.363 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:17.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:10:17.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:17.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:10:17.520 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:17.520+0000 7f51520ce8c0 -1 Falling back to public interface 2026-03-08T23:10:17.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:18.487 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:18.488+0000 7f51520ce8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:10:18.530 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:10:18.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:10:18.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:18.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:10:18.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:18.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:10:18.700 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:19.701 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:10:19.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:10:19.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:19.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:10:19.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:19.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2470029396,v1:127.0.0.1:6803/2470029396] [v2:127.0.0.1:6804/2470029396,v1:127.0.0.1:6805/2470029396] exists,up e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3603: corrupt_scrub_erasure: for id in $(seq 0 2) 
2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3604: corrupt_scrub_erasure: run_osd td/osd-scrub-repair 1 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:10:19.863 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:10:19.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:19.864 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:10:19.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:10:19.865 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:10:19.866 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 55610426-1f0a-4abd-b96b-732daaa923cb
2026-03-08T23:10:19.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=55610426-1f0a-4abd-b96b-732daaa923cb
2026-03-08T23:10:19.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 55610426-1f0a-4abd-b96b-732daaa923cb'
2026-03-08T23:10:19.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:10:19.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDbAa5ppUWRNBAA19cnonMVdfkq2XHyvv2BAg==
2026-03-08T23:10:19.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDbAa5ppUWRNBAA19cnonMVdfkq2XHyvv2BAg=="}'
2026-03-08T23:10:19.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 55610426-1f0a-4abd-b96b-732daaa923cb -i td/osd-scrub-repair/1/new.json
2026-03-08T23:10:20.039 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:10:20.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json
2026-03-08T23:10:20.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDbAa5ppUWRNBAA19cnonMVdfkq2XHyvv2BAg== --osd-uuid 55610426-1f0a-4abd-b96b-732daaa923cb
2026-03-08T23:10:20.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:20.068+0000 7f207fb7f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:20.070 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:20.068+0000 7f207fb7f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:20.071 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:20.072+0000 7f207fb7f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:20.071 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:20.072+0000 7f207fb7f8c0 -1 bdev(0x5628f0d41c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted
2026-03-08T23:10:20.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:20.072+0000 7f207fb7f8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid
2026-03-08T23:10:22.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring
2026-03-08T23:10:22.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:10:22.308 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T23:10:22.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T23:10:22.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:10:22.502 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T23:10:22.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T23:10:22.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:10:22.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:10:22.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:10:22.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:10:22.516 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:22.516+0000 7f5faa2428c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:22.524 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:22.524+0000 7f5faa2428c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:22.526 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:22.524+0000 7f5faa2428c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:22.673 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:10:22.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T23:10:22.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:10:22.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:10:22.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:10:22.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:10:22.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:10:22.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:10:22.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:10:22.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:10:22.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:10:23.488 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:23.488+0000 7f5faa2428c0 -1 Falling back to public interface
2026-03-08T23:10:23.830 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:10:23.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:10:23.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:10:23.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:10:23.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:10:23.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:10:23.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:10:24.459 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:24.460+0000 7f5faa2428c0 -1 osd.1 0 log_to_monitors true
2026-03-08T23:10:24.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:10:24.987 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:10:24.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:10:24.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:10:24.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:10:24.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:10:25.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:10:26.163 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:10:26.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:10:26.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:10:26.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:10:26.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:10:26.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:10:26.315 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/4070626999,v1:127.0.0.1:6811/4070626999] [v2:127.0.0.1:6812/4070626999,v1:127.0.0.1:6813/4070626999] exists,up 55610426-1f0a-4abd-b96b-732daaa923cb
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3603: corrupt_scrub_erasure: for id in $(seq 0 2)
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3604: corrupt_scrub_erasure: run_osd td/osd-scrub-repair 2
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2'
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal'
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:10:26.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:10:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2
2026-03-08T23:10:26.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:10:26.319 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 1b9d9246-36ef-44da-b279-b56a967bacd4
2026-03-08T23:10:26.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=1b9d9246-36ef-44da-b279-b56a967bacd4
2026-03-08T23:10:26.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 1b9d9246-36ef-44da-b279-b56a967bacd4'
2026-03-08T23:10:26.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:10:26.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDiAa5pOlLqExAAd4BQgngpk7eYppr5hbZTbQ==
2026-03-08T23:10:26.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDiAa5pOlLqExAAd4BQgngpk7eYppr5hbZTbQ=="}'
2026-03-08T23:10:26.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 1b9d9246-36ef-44da-b279-b56a967bacd4 -i td/osd-scrub-repair/2/new.json
2026-03-08T23:10:26.481 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:10:26.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json
2026-03-08T23:10:26.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDiAa5pOlLqExAAd4BQgngpk7eYppr5hbZTbQ== --osd-uuid 1b9d9246-36ef-44da-b279-b56a967bacd4
2026-03-08T23:10:26.509 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:26.508+0000 7f12eb5898c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:26.511 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:26.512+0000 7f12eb5898c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:26.512 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:26.512+0000 7f12eb5898c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:26.512 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:26.512+0000 7f12eb5898c0 -1 bdev(0x560c2f79fc00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted
2026-03-08T23:10:26.512 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:26.512+0000 7f12eb5898c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid
2026-03-08T23:10:29.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring
2026-03-08T23:10:29.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:10:29.248 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository
2026-03-08T23:10:29.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T23:10:29.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:10:29.437 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2
2026-03-08T23:10:29.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T23:10:29.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:10:29.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:10:29.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:10:29.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:10:29.451 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:29.448+0000 7f2024c968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:29.459 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:29.460+0000 7f2024c968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:29.460 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:29.460+0000 7f2024c968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:10:29.603 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:10:29.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:10:29.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:10:30.769 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:10:30.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:10:30.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:10:30.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:10:30.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:10:30.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:10:30.888 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:30.888+0000 7f2024c968c0 -1 Falling back to public interface
2026-03-08T23:10:30.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:10:31.860 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:31.860+0000 7f2024c968c0 -1 osd.2 0 log_to_monitors true
2026-03-08T23:10:31.927 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:10:31.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:10:31.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:10:31.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:10:31.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:10:31.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:10:32.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:10:33.115 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:10:33.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:10:33.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:10:33.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:10:33.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:10:33.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:10:33.280 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/428107751,v1:127.0.0.1:6819/428107751] [v2:127.0.0.1:6820/428107751,v1:127.0.0.1:6821/428107751] exists,up 1b9d9246-36ef-44da-b279-b56a967bacd4
2026-03-08T23:10:33.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:10:33.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:10:33.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:10:33.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3606: corrupt_scrub_erasure: create_rbd_pool
2026-03-08T23:10:33.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it
2026-03-08T23:10:33.429 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist
2026-03-08T23:10:33.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4
2026-03-08T23:10:33.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4
2026-03-08T23:10:33.652 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created
2026-03-08T23:10:33.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:10:34.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd
2026-03-08T23:10:34.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3607: corrupt_scrub_erasure: create_pool foo 1
2026-03-08T23:10:34.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create foo 1
2026-03-08T23:10:35.160 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' created
2026-03-08T23:10:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:10:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3609: corrupt_scrub_erasure: create_ec_pool ecpool true k=2 m=1 stripe_unit=2K --force
2026-03-08T23:10:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool
2026-03-08T23:10:36.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift
2026-03-08T23:10:36.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=true
2026-03-08T23:10:36.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift
2026-03-08T23:10:36.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=1 stripe_unit=2K --force
2026-03-08T23:10:36.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile
2026-03-08T23:10:36.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile
2026-03-08T23:10:36.688 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created
2026-03-08T23:10:36.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:10:37.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' true = true ']'
2026-03-08T23:10:37.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2508: create_ec_pool: ceph osd pool set ecpool allow_ec_overwrites true
2026-03-08T23:10:37.905 INFO:tasks.workunit.client.0.vm03.stderr:set pool 3 allow_ec_overwrites to true
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:10:37.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:10:37.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15'
'15' '4.5') 2026-03-08T23:10:37.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:10:37.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:10:37.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:10:37.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:10:37.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:10:38.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:10:38.128 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:10:38.128 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:10:38.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:10:38.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:38.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:10:38.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T23:10:38.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: 
test -z 21474836485 2026-03-08T23:10:38.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:10:38.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:38.198 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:10:38.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672964 2026-03-08T23:10:38.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672964 2026-03-08T23:10:38.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964' 2026-03-08T23:10:38.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:38.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:10:38.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542147 2026-03-08T23:10:38.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542147 2026-03-08T23:10:38.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964 2-60129542147' 
2026-03-08T23:10:38.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:38.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:10:38.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:38.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:10:38.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:10:38.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:38.338 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:10:38.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:10:38.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:10:38.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:38.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T23:10:38.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T23:10:39.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:10:39.497 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:39.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T23:10:39.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:10:40.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:10:40.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:40.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836485 2026-03-08T23:10:40.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:40.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672964 2026-03-08T23:10:40.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:40.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:10:40.812 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672964 2026-03-08T23:10:40.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:40.813 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672964 2026-03-08T23:10:40.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672964 2026-03-08T23:10:40.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672964' 2026-03-08T23:10:40.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:10:40.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672964 -lt 42949672964 2026-03-08T23:10:40.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:40.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542147 2026-03-08T23:10:40.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:40.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:10:40.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542147 
2026-03-08T23:10:40.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:40.977 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542147 2026-03-08T23:10:40.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542147 2026-03-08T23:10:40.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542147' 2026-03-08T23:10:40.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:10:41.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542147 -lt 60129542147 2026-03-08T23:10:41.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:10:41.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:10:41.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:10:41.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:10:41.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:10:41.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: 
get_num_active_clean 2026-03-08T23:10:41.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:10:41.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:10:41.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:10:41.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:10:41.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:10:41.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:10:41.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:10:41.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:10:41.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:10:41.654 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3610: corrupt_scrub_erasure: wait_for_clean 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:10:41.654 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:10:41.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:10:41.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:10:41.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:10:41.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:10:41.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:10:41.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:10:41.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:10:41.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:10:41.874 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:10:41.874 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:10:41.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:10:41.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: 
flush_pg_stats: for osd in $ids 2026-03-08T23:10:41.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:10:41.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836487 2026-03-08T23:10:41.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836487 2026-03-08T23:10:41.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487' 2026-03-08T23:10:41.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:41.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:10:42.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672966 2026-03-08T23:10:42.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672966 2026-03-08T23:10:42.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-42949672966' 2026-03-08T23:10:42.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:42.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 
2026-03-08T23:10:42.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542148 2026-03-08T23:10:42.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542148 2026-03-08T23:10:42.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-42949672966 2-60129542148' 2026-03-08T23:10:42.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:42.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836487 2026-03-08T23:10:42.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:42.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:10:42.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836487 2026-03-08T23:10:42.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:42.105 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836487 2026-03-08T23:10:42.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836487 2026-03-08T23:10:42.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting 
osd.0 seq 21474836487' 2026-03-08T23:10:42.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:42.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T23:10:42.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:10:43.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:10:43.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:43.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T23:10:43.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:10:44.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:10:44.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:44.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836487 2026-03-08T23:10:44.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:44.595 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672966 2026-03-08T23:10:44.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:44.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:10:44.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672966 2026-03-08T23:10:44.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:44.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672966 2026-03-08T23:10:44.598 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672966 2026-03-08T23:10:44.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672966' 2026-03-08T23:10:44.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:10:44.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672966 -lt 42949672966 2026-03-08T23:10:44.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:44.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542148 
2026-03-08T23:10:44.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:44.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:10:44.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542148 2026-03-08T23:10:44.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:44.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542148 2026-03-08T23:10:44.754 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542148 2026-03-08T23:10:44.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542148' 2026-03-08T23:10:44.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:10:44.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542149 -lt 60129542148 2026-03-08T23:10:44.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:10:44.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:10:44.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq 
.pgmap.num_pgs 2026-03-08T23:10:45.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:10:45.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:10:45.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:10:45.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:10:45.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:10:45.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:10:45.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:10:45.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:10:45.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:10:45.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:10:45.259 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:10:45.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:10:45.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:10:45.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:10:45.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:10:45.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: seq 1 7 2026-03-08T23:10:45.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:10:45.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ1 2026-03-08T23:10:45.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ1 2026-03-08T23:10:45.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:10:45.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:10:45.445 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ1 2026-03-08T23:10:45.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:10:45.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:10:45.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:10:45.645 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:10:45.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:10:45.853 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:10:45.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:10:45.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:10:45.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ1 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:10:45.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 1 % 2 2026-03-08T23:10:45.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=1 
2026-03-08T23:10:45.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:10:45.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3621: corrupt_scrub_erasure: local payload=UVWXYZZZ 2026-03-08T23:10:45.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3622: corrupt_scrub_erasure: echo UVWXYZZZ 2026-03-08T23:10:45.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3623: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 1 EOBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:10:45.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 EOBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:10:45.889 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:10:45.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:10:46.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:10:46.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 EOBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:10:46.197 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:10:46.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:10:46.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:10:46.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:10:46.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:10:46.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 EOBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: 
shift 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' 
--run-dir=td/osd-scrub-repair' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:47.383 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: 
activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:10:47.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:10:47.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:10:47.385 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:10:47.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:10:47.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:10:47.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:10:47.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:10:47.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:10:47.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:10:47.403 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:47.400+0000 7f90468da8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:10:47.403 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:47.404+0000 7f90468da8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:10:47.404 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:47.404+0000 7f90468da8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:47.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:10:47.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:48.604 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:48.604+0000 7f90468da8c0 -1 Falling back to public interface 2026-03-08T23:10:48.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:10:48.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:48.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:10:48.725 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:10:48.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:48.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:10:48.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:49.584 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:49.584+0000 7f90468da8c0 -1 osd.1 32 log_to_monitors true 2026-03-08T23:10:49.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:10:49.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:49.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:10:49.886 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:10:49.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:49.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:10:50.055 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:50.342 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:50.340+0000 7f903d88a640 -1 osd.1 32 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:10:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:10:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:10:51.056 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:10:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:51.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:10:51.209 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 36 up_thru 36 down_at 33 last_clean_interval [10,32) [v2:127.0.0.1:6810/2200052931,v1:127.0.0.1:6811/2200052931] [v2:127.0.0.1:6812/2200052931,v1:127.0.0.1:6813/2200052931] exists,up 55610426-1f0a-4abd-b96b-732daaa923cb 2026-03-08T23:10:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:10:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:10:51.209 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:10:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:10:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:10:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:10:51.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:10:51.209 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:10:51.210 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:10:51.210 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:10:51.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:10:51.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:10:51.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:10:51.268 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:10:51.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:10:51.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:10:51.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:10:51.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:10:51.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:10:51.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:10:51.421 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:10:51.421 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:10:51.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:10:51.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:51.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:10:51.492 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836490 2026-03-08T23:10:51.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836490 2026-03-08T23:10:51.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490' 2026-03-08T23:10:51.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:51.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:10:51.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=154618822658 2026-03-08T23:10:51.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 154618822658 2026-03-08T23:10:51.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-154618822658' 2026-03-08T23:10:51.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:51.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:10:51.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542151 2026-03-08T23:10:51.631 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542151 2026-03-08T23:10:51.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-154618822658 2-60129542151' 2026-03-08T23:10:51.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:51.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836490 2026-03-08T23:10:51.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:51.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:10:51.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836490 2026-03-08T23:10:51.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:51.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836490 2026-03-08T23:10:51.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836490' 2026-03-08T23:10:51.633 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836490 2026-03-08T23:10:51.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:10:51.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836489 -lt 21474836490 2026-03-08T23:10:51.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:10:52.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:10:52.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:52.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836490 2026-03-08T23:10:52.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:52.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-154618822658 2026-03-08T23:10:52.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:52.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:10:52.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-154618822658 2026-03-08T23:10:52.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:52.953 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=154618822658 2026-03-08T23:10:52.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 154618822658' 2026-03-08T23:10:52.954 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 154618822658 2026-03-08T23:10:52.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:10:53.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822658 -lt 154618822658 2026-03-08T23:10:53.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:53.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542151 2026-03-08T23:10:53.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:53.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:10:53.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542151 2026-03-08T23:10:53.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:53.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542151 
2026-03-08T23:10:53.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542151' 2026-03-08T23:10:53.118 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542151 2026-03-08T23:10:53.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:10:53.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542151 -lt 60129542151 2026-03-08T23:10:53.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:10:53.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:10:53.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:10:53.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:10:53.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:10:53.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:10:53.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:10:53.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: 
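The `flush_pg_stats` trace above follows a simple pattern: ask each OSD for its current stat sequence with `ceph tell osd.N flush_pg_stats`, then poll `ceph osd last-stat-seq N` until the monitor has caught up to that sequence (e.g. `test 21474836489 -lt 21474836490` fails once, then passes after one `sleep 1`). A minimal pure-bash sketch of that wait loop, with the `ceph` call replaced by a stub — the stub and its counter are illustrative, not part of the helper:

```shell
#!/usr/bin/env bash
# Stub for `ceph osd last-stat-seq $osd`: sets $seq to a counter that
# advances on each call, simulating the monitor's view catching up.
# (A global is used instead of echo so the increment is not lost in a
# command-substitution subshell.)
last_stat_seq=21474836488
ceph_last_stat_seq() {
    last_stat_seq=$((last_stat_seq + 1))
    seq=$last_stat_seq
}

# Same shape as the helper's loop: poll until the reported seq reaches
# the target, sleeping between attempts, bounded by a retry budget.
wait_for_seq() {
    local osd=$1 target=$2 timeout=$3
    local seq=0
    while ceph_last_stat_seq; test "$seq" -lt "$target"; do
        sleep 0.01                     # the real helper sleeps 1s
        timeout=$((timeout - 1))
        test "$timeout" -eq 0 && return 1
    done
    return 0
}

wait_for_seq 0 21474836490 300 && caught_up=yes
```

As in the trace, the first poll falls one short of the target and the second succeeds.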
get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:10:53.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:10:53.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:10:53.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:10:53.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:10:53.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:10:53.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:10:53.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:10:53.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:10:53.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:10:53.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:10:53.852 
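`get_num_active_clean` above pipes `ceph --format json pg dump pgs` through a jq filter that keeps only PG states containing both "active" and "clean" while excluding anything "stale", then counts the survivors. The same selection can be expressed with plain grep over a list of state strings; the sample states below are made up for illustration, not taken from this run:

```shell
#!/usr/bin/env bash
# Hypothetical PG states, one per line, standing in for
# `.pg_stats[].state` from `ceph pg dump pgs`.
states="active+clean
active+clean+scrubbing
stale+active+clean
active+recovering
active+clean"

# Keep states with both "active" and "clean", drop "stale", count.
# Mirrors: select(contains("active") and contains("clean")) |
#          select(contains("stale") | not)
num=$(printf '%s\n' "$states" | grep active | grep clean | grep -v stale | wc -l)
```

Here the stale PG is excluded and the recovering one never matches, leaving three.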
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:10:53.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ2 2026-03-08T23:10:53.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ2 2026-03-08T23:10:53.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:10:53.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:10:53.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ2 2026-03-08T23:10:53.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:10:53.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:10:53.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:10:54.016 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:10:54.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:10:54.221 
INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:10:54.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:10:54.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:10:54.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ2 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:10:54.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 2 % 2 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=0 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3628: corrupt_scrub_erasure: dd if=/dev/urandom of=td/osd-scrub-repair/CORRUPT bs=2048 count=1 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:2048 bytes (2.0 kB, 2.0 KiB) copied, 6.1786e-05 s, 33.1 MB/s 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3629: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:10:54.260 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:10:54.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:10:54.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 
2026-03-08T23:10:54.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:10:54.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:10:54.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:10:54.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:10:54.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:10:54.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:10:54.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:10:54.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:10:54.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:10:54.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:10:54.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 
2026-03-08T23:10:54.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:10:55.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:10:55.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:10:55.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:10:55.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 
2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:10:55.551 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:10:55.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:10:55.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:10:55.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 
2026-03-08T23:10:55.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:10:55.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:10:55.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:10:55.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:10:55.552 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:10:55.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:10:55.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:10:55.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:10:55.554 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:10:55.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:10:55.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:10:55.569 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:55.568+0000 7f15b05c88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:10:55.569 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:55.568+0000 7f15b05c88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:10:55.570 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:55.568+0000 7f15b05c88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:55.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:10:55.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:56.780 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:56.780+0000 7f15b05c88c0 -1 Falling back to public interface 2026-03-08T23:10:56.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:10:56.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:56.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:10:56.900 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:10:56.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:56.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:10:57.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:57.761 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:10:57.760+0000 7f15b05c88c0 -1 osd.0 40 log_to_monitors true 2026-03-08T23:10:58.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:10:58.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:58.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:10:58.061 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:10:58.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:58.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:10:58.236 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:10:59.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:10:59.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:10:59.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:10:59.238 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:10:59.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:10:59.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:10:59.392 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 44 up_thru 44 down_at 41 last_clean_interval [5,40) [v2:127.0.0.1:6802/281452896,v1:127.0.0.1:6803/281452896] [v2:127.0.0.1:6804/281452896,v1:127.0.0.1:6805/281452896] exists,up e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:10:59.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:10:59.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:10:59.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:10:59.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:10:59.392 
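The `wait_for_osd` trace above is a bounded polling loop: up to 300 iterations, each running `ceph osd dump | grep 'osd.0 up'` and sleeping 1s on a miss, breaking with status 0 on the first hit (here on iteration 3, once the restarted OSD reports `up`). A minimal sketch of that pattern with the `ceph osd dump` call replaced by a stub that reports "up" after a few polls — the stub writes a global rather than echoing, so its countdown survives between calls:

```shell
#!/usr/bin/env bash
# Stub for `ceph osd dump`: reports the OSD down for a few polls,
# then up. Purely illustrative state, not the helper's behavior.
tries_left=3
osd_dump() {
    if [ "$tries_left" -le 0 ]; then
        dump_out="osd.0 up in weight 1"
    else
        tries_left=$((tries_left - 1))
        dump_out="osd.0 down"
    fi
}

# Same shape as wait_for_osd: poll, match the desired state, sleep on miss.
wait_for_osd() {
    local state=$1 id=$2 status=1 i
    for ((i = 0; i < 300; i++)); do
        osd_dump
        case "$dump_out" in
            *"osd.$id $state"*) status=0; break ;;
        esac
        sleep 0.01                     # the real helper sleeps 1s
    done
    return $status
}

wait_for_osd up 0 && result=up
```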
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:10:59.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:10:59.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:10:59.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:10:59.393 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:10:59.393 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:10:59.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:10:59.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:10:59.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:10:59.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:10:59.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:10:59.451 
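The `get_timeout_delays 90 .1` call above yields the schedule `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: delays double from the base, are capped at 15, and the final entry trims the sum to exactly the 90s timeout. A sketch of that schedule-building logic in integer arithmetic for clarity (the helper itself works in fractional seconds; the cap value of 15 is inferred from the trace):

```shell
#!/usr/bin/env bash
# Build a backoff schedule: doubling delays, capped, summing to $timeout.
get_delays() {
    local timeout=$1 delay=$2 cap=${3:-15}
    local sum=0 d=$delay
    local -a out=()
    while [ "$sum" -lt "$timeout" ]; do
        [ "$d" -gt "$cap" ] && d=$cap              # cap each step
        if [ $((sum + d)) -gt "$timeout" ]; then
            d=$((timeout - sum))                    # trim the last step
        fi
        out+=("$d")
        sum=$((sum + d))
        d=$((d * 2))
    done
    echo "${out[@]}"
}

schedule=$(get_delays 90 1)
```

With a base of 1 instead of 0.1 the doubling run lands on the timeout exactly, so no fractional tail entry is needed.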
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:10:59.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:10:59.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:10:59.451 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:10:59.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:10:59.612 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:10:59.612 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:10:59.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:10:59.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:59.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:10:59.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561026 2026-03-08T23:10:59.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561026 2026-03-08T23:10:59.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561026' 
2026-03-08T23:10:59.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:59.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:10:59.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=154618822661 2026-03-08T23:10:59.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 154618822661 2026-03-08T23:10:59.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561026 1-154618822661' 2026-03-08T23:10:59.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:10:59.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:10:59.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542154 2026-03-08T23:10:59.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542154 2026-03-08T23:10:59.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561026 1-154618822661 2-60129542154' 2026-03-08T23:10:59.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:59.837 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-188978561026 2026-03-08T23:10:59.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:10:59.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:10:59.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-188978561026 2026-03-08T23:10:59.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:10:59.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561026 2026-03-08T23:10:59.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 188978561026' 2026-03-08T23:10:59.839 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 188978561026 2026-03-08T23:10:59.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:10:59.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561026 -lt 188978561026 2026-03-08T23:10:59.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:10:59.995 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T23:10:59.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-154618822661 2026-03-08T23:11:00.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:11:00.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:00.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-154618822661 2026-03-08T23:11:00.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=154618822661 2026-03-08T23:11:00.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 154618822661' 2026-03-08T23:11:00.003 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 154618822661 2026-03-08T23:11:00.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:11:00.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822660 -lt 154618822661 2026-03-08T23:11:00.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:01.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:11:01.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:11:01.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822660 -lt 154618822661 2026-03-08T23:11:01.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:02.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:11:02.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:11:02.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 154618822661 -lt 154618822661 2026-03-08T23:11:02.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:02.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:02.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542154 2026-03-08T23:11:02.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:11:02.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542154 2026-03-08T23:11:02.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:02.492 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542154 2026-03-08T23:11:02.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542154' 2026-03-08T23:11:02.493 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542154 2026-03-08T23:11:02.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:11:02.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542154 -lt 60129542154 2026-03-08T23:11:02.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:11:02.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:02.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:02.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:11:02.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:11:02.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:11:02.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 
2026-03-08T23:11:02.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:11:02.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:11:02.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:11:02.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:11:03.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:11:03.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:11:03.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:03.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:03.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:11:03.208 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ3 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ3 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ3 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:11:03.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:11:03.409 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:11:03.424 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:11:03.618 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:11:03.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:11:03.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:11:03.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ3 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:11:03.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 3 % 2 2026-03-08T23:11:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=1 2026-03-08T23:11:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:11:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3634: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 1 EOBJ3 remove 2026-03-08T23:11:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:11:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:11:03.657 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:11:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:11:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 EOBJ3 remove 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:11:03.658 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:11:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:11:03.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:11:03.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 EOBJ3 remove 2026-03-08T23:11:03.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:11:03.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:11:03.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:11:03.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:11:03.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:11:03.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 EOBJ3 remove 2026-03-08T23:11:04.622 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#3:b197b25d:::EOBJ3:head# 2026-03-08T23:11:05.162 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:11:05.163 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:11:05.163 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:11:05.163 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:11:05.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:11:05.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:11:05.165 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:11:05.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:11:05.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:11:05.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:11:05.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:11:05.167 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:11:05.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:11:05.182 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:05.180+0000 7f7687c388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:05.182 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:05.180+0000 7f7687c388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:05.183 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:05.180+0000 7f7687c388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:11:05.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:11:05.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:11:06.140 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:06.140+0000 7f7687c388c0 -1 Falling back to public interface
2026-03-08T23:11:06.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:11:06.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:11:06.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:11:06.508 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:11:06.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:11:06.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:11:06.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:11:07.116 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:07.116+0000 7f7687c388c0 -1 osd.1 47 log_to_monitors true
2026-03-08T23:11:07.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:11:07.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:11:07.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:11:07.669 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:11:07.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:11:07.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:11:07.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:11:07.913 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:07.912+0000 7f767ebe8640 -1 osd.1 47 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:11:08.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:11:08.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:11:08.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:11:08.867 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:11:08.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:11:08.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 51 up_thru 51 down_at 48 last_clean_interval [36,47) [v2:127.0.0.1:6810/1768989436,v1:127.0.0.1:6811/1768989436] [v2:127.0.0.1:6812/1768989436,v1:127.0.0.1:6813/1768989436] exists,up 55610426-1f0a-4abd-b96b-732daaa923cb
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:11:09.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:11:09.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:11:09.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:11:09.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:11:09.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:11:09.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:11:09.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:11:09.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:11:09.242 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:11:09.242 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:11:09.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:11:09.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:11:09.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:11:09.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561029
2026-03-08T23:11:09.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561029
2026-03-08T23:11:09.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561029'
2026-03-08T23:11:09.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:11:09.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:11:09.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332098
2026-03-08T23:11:09.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332098
2026-03-08T23:11:09.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561029 1-219043332098'
2026-03-08T23:11:09.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:11:09.390 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:11:09.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542157
2026-03-08T23:11:09.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542157
2026-03-08T23:11:09.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-188978561029 1-219043332098 2-60129542157'
2026-03-08T23:11:09.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:11:09.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-188978561029
2026-03-08T23:11:09.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:11:09.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:11:09.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-188978561029
2026-03-08T23:11:09.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:11:09.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561029
2026-03-08T23:11:09.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 188978561029'
2026-03-08T23:11:09.467 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 188978561029
2026-03-08T23:11:09.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:11:09.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561027 -lt 188978561029
2026-03-08T23:11:09.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:11:10.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:11:10.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:11:10.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561029 -lt 188978561029
2026-03-08T23:11:10.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:11:10.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332098
2026-03-08T23:11:10.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:11:10.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:11:10.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332098
2026-03-08T23:11:10.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:11:10.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332098
2026-03-08T23:11:10.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332098'
2026-03-08T23:11:10.787 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332098
2026-03-08T23:11:10.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:11:10.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332098 -lt 219043332098
2026-03-08T23:11:10.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:11:10.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542157
2026-03-08T23:11:10.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:11:10.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:11:10.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542157
2026-03-08T23:11:10.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:11:10.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542157
2026-03-08T23:11:10.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542157'
2026-03-08T23:11:10.947 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542157
2026-03-08T23:11:10.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:11:11.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542157 -lt 60129542157
2026-03-08T23:11:11.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:11:11.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:11:11.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:11:11.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:11:11.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:11:11.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:11:11.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:11:11.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:11:11.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:11:11.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:11:11.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:11:11.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:11:11.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:11:11.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:11:11.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs)
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ4
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ4
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool
2026-03-08T23:11:11.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ4
2026-03-08T23:11:11.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:11:11.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:11:11.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:11:11.914 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:11:11.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:11:12.120 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:11:12.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:11:12.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:11:12.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ4 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:11:12.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 4 % 2
2026-03-08T23:11:12.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=0
2026-03-08T23:11:12.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in
2026-03-08T23:11:12.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3638: corrupt_scrub_erasure: rados --pool ecpool setxattr EOBJ4 key1-EOBJ4 val1-EOBJ4
2026-03-08T23:11:12.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3639: corrupt_scrub_erasure: rados --pool ecpool setxattr EOBJ4 key2-EOBJ4 val2-EOBJ4
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3642: corrupt_scrub_erasure: echo -n bad-val
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3643: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ4 set-attr _key1-EOBJ4 td/osd-scrub-repair/bad-val
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ4 set-attr _key1-EOBJ4 td/osd-scrub-repair/bad-val
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:11:12.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:11:12.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:11:12.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ4 set-attr _key1-EOBJ4 td/osd-scrub-repair/bad-val
2026-03-08T23:11:12.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:11:12.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:11:12.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:11:12.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:11:12.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:11:12.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ4 set-attr _key1-EOBJ4 td/osd-scrub-repair/bad-val
2026-03-08T23:11:13.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:11:13.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:11:13.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:11:13.256 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T23:11:13.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:11:13.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:11:13.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:11:13.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:11:13.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:11:13.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:11:13.274 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:13.272+0000 7f41474ac8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:11:13.274 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:13.272+0000 7f41474ac8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:11:13.275 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:13.272+0000 7f41474ac8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:11:13.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:11:13.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:11:13.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:11:13.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:11:13.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:11:13.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:11:13.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:11:13.433 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:11:13.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:11:13.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:11:13.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:11:14.476 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:14.476+0000 7f41474ac8c0 -1 Falling back to public interface
2026-03-08T23:11:14.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++
)) 2026-03-08T23:11:14.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:14.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:11:14.596 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:14.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:14.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:11:14.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:15.451 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:15.448+0000 7f41474ac8c0 -1 osd.0 54 log_to_monitors true 2026-03-08T23:11:15.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:15.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:15.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:11:15.767 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:11:15.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:15.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:11:15.939 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:16.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:16.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:16.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:11:16.941 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:11:16.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:16.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 58 up_thru 58 down_at 55 last_clean_interval [44,54) [v2:127.0.0.1:6802/1804449787,v1:127.0.0.1:6803/1804449787] [v2:127.0.0.1:6804/1804449787,v1:127.0.0.1:6805/1804449787] exists,up e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
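An aside on the `wait_for_osd` loop traced above: it is a bounded poll, retrying `ceph osd dump | grep 'osd.N up'` once per second for up to 300 attempts. A generalized sketch (the name `wait_for` is ours, not from ceph-helpers.sh) of the same pattern:

```shell
#!/usr/bin/env bash
# Illustrative generalization of the wait_for_osd loop seen in the trace:
# retry a command once per second, up to a bounded number of attempts.
wait_for() {
    local tries=$1; shift
    local i
    for ((i = 0; i < tries; i++)); do
        "$@" > /dev/null 2>&1 && return 0   # predicate succeeded
        sleep 1                             # mirrors ceph-helpers.sh:985
    done
    return 1                                # gave up after $tries attempts
}

# The predicate in the log is a pipeline, so it would be wrapped, e.g.:
#   wait_for 300 sh -c 'ceph osd dump | grep -q "osd.0 up"'
```

In the trace the loop succeeds on iteration 3, when `ceph osd dump` finally reports `osd.0 up in weight 1 ...`.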
2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:11:17.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:11:17.109 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:11:17.109 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:11:17.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:11:17.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:11:17.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:11:17.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:11:17.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
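The `delays` array that `get_timeout_delays 90 .1` produces above follows a clear schedule: delays double from the 0.1s granularity, are capped at 15s, and the final entry (4.5) tops the sum up to exactly the 90s timeout. A simplified sketch of that schedule (ours, not the real ceph-helpers.sh implementation; the 15s cap is inferred from the traced output):

```shell
#!/usr/bin/env bash
# Sketch of the backoff schedule visible in the wait_for_clean trace:
# doubling delays, capped, with a final top-up entry summing to the timeout.
backoff_delays() {
    local timeout=$1 first=$2 cap=${3:-15}
    awk -v total="$timeout" -v d="$first" -v cap="$cap" 'BEGIN {
        sum = 0
        while (sum + d < total) {
            printf "%g ", d
            sum += d
            d *= 2
            if (d > cap) d = cap
        }
        printf "%g\n", total - sum   # top-up so the delays sum to the timeout
    }'
}

backoff_delays 90 .1
# → 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

The output matches the `delays=(...)` array recorded in the log line above.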
2026-03-08T23:11:17.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:11:17.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:11:17.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:11:17.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:11:17.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:11:17.331 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:17.331 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:11:17.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:11:17.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:17.332 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:11:17.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=249108103170 2026-03-08T23:11:17.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 249108103170 2026-03-08T23:11:17.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-249108103170' 2026-03-08T23:11:17.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:17.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:11:17.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332101 2026-03-08T23:11:17.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332101 2026-03-08T23:11:17.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103170 1-219043332101' 2026-03-08T23:11:17.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:17.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:11:17.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542159 2026-03-08T23:11:17.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542159 2026-03-08T23:11:17.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103170 1-219043332101 2-60129542159' 2026-03-08T23:11:17.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:11:17.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-249108103170 2026-03-08T23:11:17.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:17.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:11:17.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-249108103170 2026-03-08T23:11:17.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:17.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=249108103170 2026-03-08T23:11:17.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 249108103170' 2026-03-08T23:11:17.554 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 249108103170 2026-03-08T23:11:17.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:17.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 249108103170 2026-03-08T23:11:17.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:18.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T23:11:18.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:18.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103170 -lt 249108103170 2026-03-08T23:11:18.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:18.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332101 2026-03-08T23:11:18.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:18.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:11:18.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332101 2026-03-08T23:11:18.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:18.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332101 2026-03-08T23:11:18.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332101' 2026-03-08T23:11:18.890 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332101 2026-03-08T23:11:18.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:11:19.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332101 -lt 219043332101 2026-03-08T23:11:19.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:19.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542159 2026-03-08T23:11:19.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:19.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:11:19.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542159 2026-03-08T23:11:19.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:19.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542159 2026-03-08T23:11:19.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542159' 2026-03-08T23:11:19.068 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 60129542159 2026-03-08T23:11:19.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:11:19.237 
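The `flush_pg_stats` steps traced here pack each OSD's flush sequence number into an `osd-seq` string, then unpack the pairs with `cut -d - -f 1` / `-f 2` and wait until `ceph osd last-stat-seq` catches up. The pack/unpack round-trip can be isolated as a small sketch (the helper name `unpack_seqs` is ours; the seq values are the ones in the log):

```shell
#!/usr/bin/env bash
# Sketch of how flush_pg_stats (per the trace) round-trips "osd-seq" pairs.
unpack_seqs() {
    local seqs="$1" s osd seq
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)   # field 1: osd id
        seq=$(echo "$s" | cut -d - -f 2)   # field 2: flush sequence number
        echo "waiting osd.$osd seq $seq"
    done
}

unpack_seqs "0-249108103170 1-219043332101 2-60129542159"
```

Each emitted `waiting osd.N seq ...` line corresponds to one `test <last-stat-seq> -lt <seq>` poll in the log.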
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542159 -lt 60129542159 2026-03-08T23:11:19.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:11:19.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:19.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:19.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:11:19.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:11:19.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:11:19.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:11:19.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:11:19.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:11:19.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
2026-03-08T23:11:19.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:11:19.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:11:19.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:11:19.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:19.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3644: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 2 EOBJ4 rm-attr _key2-EOBJ4 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:11:19.825 
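The jq program that `get_num_active_clean` builds above counts PG states containing both `active` and `clean` while excluding anything `stale`. A standalone demonstration (the `pg_stats` document below is fabricated for illustration; the jq filter is the one from the trace):

```shell
#!/usr/bin/env bash
# Demonstrates the get_num_active_clean jq filter on a made-up pg dump.
jq '.pg_stats | [.[] | .state
      | select(contains("active") and contains("clean"))
      | select(contains("stale") | not)] | length' <<'EOF'
{"pg_stats": [
  {"pgid": "1.0", "state": "active+clean"},
  {"pgid": "1.1", "state": "stale+active+clean"},
  {"pgid": "1.2", "state": "active+recovering"}
]}
EOF
# → 1  (only 1.0 counts: 1.1 is stale, 1.2 is not clean)
```

In the run above this count reaches 6, equal to `get_num_pgs`, so `wait_for_clean` breaks out on its first iteration.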
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 EOBJ4 rm-attr _key2-EOBJ4 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:11:19.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T23:11:19.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:11:19.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:11:19.826 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:11:19.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:11:19.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:11:20.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:11:20.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 EOBJ4 rm-attr _key2-EOBJ4 2026-03-08T23:11:20.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:11:20.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:11:20.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T23:11:20.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:11:20.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:11:20.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 
EOBJ4 rm-attr _key2-EOBJ4 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:11:21.323 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:11:21.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:11:21.324 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:11:21.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:11:21.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:11:21.326 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2 2026-03-08T23:11:21.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:11:21.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T23:11:21.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:11:21.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:11:21.328 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:11:21.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:11:21.344 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:21.340+0000 7fc979b528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:21.344 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:21.344+0000 7fc979b528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:21.346 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:21.344+0000 7fc979b528c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:21.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:11:21.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:11:21.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:11:21.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:11:21.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:11:21.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:21.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:11:21.506 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:11:21.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:21.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:11:21.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:22.304 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:22.304+0000 7fc979b528c0 -1 Falling back to public interface 2026-03-08T23:11:22.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:22.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:22.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:11:22.683 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:22.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:22.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:11:22.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:23.287 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:23.284+0000 7fc979b528c0 -1 osd.2 59 log_to_monitors true 2026-03-08T23:11:23.853 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:23.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:23.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:11:23.853 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:11:23.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:23.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:11:24.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:25.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:11:25.034 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:11:25.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:25.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 63 
up_thru 63 down_at 60 last_clean_interval [14,59) [v2:127.0.0.1:6818/1656848464,v1:127.0.0.1:6819/1656848464] [v2:127.0.0.1:6820/1656848464,v1:127.0.0.1:6821/1656848464] exists,up 1b9d9246-36ef-44da-b279-b56a967bacd4 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:11:25.209 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:11:25.210 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:11:25.210 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:11:25.210 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:11:25.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:11:25.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:11:25.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:11:25.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:11:25.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:11:25.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:11:25.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:11:25.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:11:25.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:11:25.449 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:25.449 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:11:25.449 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:11:25.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:25.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:11:25.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=249108103173 2026-03-08T23:11:25.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 249108103173 2026-03-08T23:11:25.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103173' 2026-03-08T23:11:25.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:25.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:11:25.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332103 2026-03-08T23:11:25.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332103 2026-03-08T23:11:25.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103173 1-219043332103' 2026-03-08T23:11:25.614 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:25.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:11:25.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=270582939650 2026-03-08T23:11:25.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 270582939650 2026-03-08T23:11:25.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103173 1-219043332103 2-270582939650' 2026-03-08T23:11:25.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:25.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-249108103173 2026-03-08T23:11:25.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:25.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:11:25.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-249108103173 2026-03-08T23:11:25.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:25.701 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=249108103173 2026-03-08T23:11:25.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 249108103173' 2026-03-08T23:11:25.701 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 249108103173 2026-03-08T23:11:25.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:25.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103173 -lt 249108103173 2026-03-08T23:11:25.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:25.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332103 2026-03-08T23:11:25.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:25.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:11:25.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332103 2026-03-08T23:11:25.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:25.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332103 
2026-03-08T23:11:25.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332103' 2026-03-08T23:11:25.887 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332103 2026-03-08T23:11:25.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:11:26.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332103 -lt 219043332103 2026-03-08T23:11:26.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:26.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:26.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-270582939650 2026-03-08T23:11:26.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:11:26.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-270582939650 2026-03-08T23:11:26.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:26.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=270582939650 2026-03-08T23:11:26.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: 
echo 'waiting osd.2 seq 270582939650' 2026-03-08T23:11:26.069 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 270582939650 2026-03-08T23:11:26.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:11:26.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 270582939650 -lt 270582939650 2026-03-08T23:11:26.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:11:26.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:26.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:26.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:11:26.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:11:26.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:11:26.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:11:26.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:11:26.436 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:11:26.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:11:26.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:11:26.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:11:26.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:11:26.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:26.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3645: 
corrupt_scrub_erasure: echo -n val3-EOBJ4 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3646: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 2 EOBJ4 set-attr _key3-EOBJ4 td/osd-scrub-repair/newval 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 EOBJ4 set-attr _key3-EOBJ4 td/osd-scrub-repair/newval 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T23:11:26.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:11:26.802 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T23:11:26.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:11:26.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:11:26.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:11:26.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:11:26.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:11:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:11:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 EOBJ4 set-attr _key3-EOBJ4 td/osd-scrub-repair/newval 2026-03-08T23:11:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:11:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:11:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T23:11:26.910 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:11:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:11:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 EOBJ4 set-attr _key3-EOBJ4 td/osd-scrub-repair/newval 2026-03-08T23:11:28.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:11:28.103 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 
2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:11:28.103 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:11:28.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:11:28.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:11:28.104 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2 2026-03-08T23:11:28.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 
2026-03-08T23:11:28.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T23:11:28.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:11:28.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:11:28.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:11:28.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:11:28.121 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:28.120+0000 7f718025b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:28.122 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:28.120+0000 7f718025b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:28.124 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:28.124+0000 7f718025b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:28.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:11:28.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:29.316 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:29.316+0000 7f718025b8c0 -1 Falling back to public interface 2026-03-08T23:11:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:11:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:11:29.458 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:29.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:11:29.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:30.551 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:30.552+0000 7f718025b8c0 -1 osd.2 64 log_to_monitors true 2026-03-08T23:11:30.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:30.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:30.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:11:30.620 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:11:30.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:30.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:11:30.800 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:31.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:31.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:31.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:11:31.802 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:11:31.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:31.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:11:31.963 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 68 up_thru 68 down_at 65 last_clean_interval [63,64) [v2:127.0.0.1:6818/1744656110,v1:127.0.0.1:6819/1744656110] [v2:127.0.0.1:6820/1744656110,v1:127.0.0.1:6821/1744656110] exists,up 1b9d9246-36ef-44da-b279-b56a967bacd4 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
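[editor's note] The `wait_for_osd` trace above is a bounded polling loop: up to 300 one-second attempts of `ceph osd dump | grep 'osd.2 up'` until the OSD shows up. A minimal Python sketch of that pattern; `check_up` is a hypothetical stand-in for the actual `ceph osd dump` grep, not code from ceph-helpers.sh:

```python
import time

def wait_for_osd(check_up, osd_id, retries=300, delay=1.0):
    """Poll until the OSD reports up, mirroring the loop in the trace:
    at most `retries` attempts with `delay` seconds between checks.

    check_up(osd_id) stands in for: ceph osd dump | grep "osd.<id> up"
    """
    for _ in range(retries):
        if check_up(osd_id):
            return True
        time.sleep(delay)
    return False
```

In the trace the loop succeeds on the fourth probe (i reaches 3), a few seconds after `ceph-osd -i 2` was forked.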
2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:11:31.964 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:11:31.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:11:31.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:11:31.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:11:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:11:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
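[editor's note] The `delays` array that `get_timeout_delays 90 .1` produces above ('0.1' '0.2' … '15' '15' '4.5') is an exponential backoff schedule: steps double from the initial delay, are capped at a maximum (15 here), and a final partial step tops the total up to exactly the timeout budget. A Python sketch that reproduces the logged schedule; this is an assumed reconstruction of the behaviour visible in the trace, not the ceph-helpers.sh source:

```python
def get_timeout_delays(timeout, first_step=1.0, max_delay=15.0):
    """Build a backoff schedule: each delay is double the previous one,
    capped at max_delay, and a final partial delay makes the schedule
    sum to exactly `timeout` seconds."""
    delays = []
    total = 0.0
    step = first_step
    while total + step <= timeout:
        delays.append(step)
        total += step
        step = min(step * 2, max_delay)
    if total < timeout:
        # round away float accumulation noise in the remainder
        delays.append(round(timeout - total, 6))
    return delays

# The schedule wait_for_clean logs for a 90 s budget, 0.1 s first step:
print(get_timeout_delays(90, 0.1))
```

The schedule keeps early retries cheap while guaranteeing the overall wait never exceeds the 90-second budget.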
2026-03-08T23:11:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:11:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:11:32.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:11:32.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:11:32.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:11:32.184 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:32.184 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:11:32.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:11:32.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:32.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:11:32.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=249108103175 2026-03-08T23:11:32.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 249108103175 2026-03-08T23:11:32.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-249108103175' 2026-03-08T23:11:32.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:32.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:11:32.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=219043332106 2026-03-08T23:11:32.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 219043332106 2026-03-08T23:11:32.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103175 1-219043332106' 2026-03-08T23:11:32.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:32.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:11:32.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776130 2026-03-08T23:11:32.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776130 2026-03-08T23:11:32.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103175 1-219043332106 2-292057776130' 2026-03-08T23:11:32.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:11:32.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-249108103175 2026-03-08T23:11:32.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:32.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:11:32.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-249108103175 2026-03-08T23:11:32.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:32.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=249108103175 2026-03-08T23:11:32.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 249108103175' 2026-03-08T23:11:32.421 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 249108103175 2026-03-08T23:11:32.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:32.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103174 -lt 249108103175 2026-03-08T23:11:32.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:33.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 
-eq 0 ']' 2026-03-08T23:11:33.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:33.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103174 -lt 249108103175 2026-03-08T23:11:33.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:34.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:11:34.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:34.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103175 -lt 249108103175 2026-03-08T23:11:34.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:34.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-219043332106 2026-03-08T23:11:34.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:34.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:11:34.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-219043332106 2026-03-08T23:11:34.930 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:34.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=219043332106 2026-03-08T23:11:34.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 219043332106' 2026-03-08T23:11:34.931 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 219043332106 2026-03-08T23:11:34.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:11:35.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 219043332106 -lt 219043332106 2026-03-08T23:11:35.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:35.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776130 2026-03-08T23:11:35.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:35.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:11:35.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776130 2026-03-08T23:11:35.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:11:35.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776130 2026-03-08T23:11:35.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776130' 2026-03-08T23:11:35.106 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776130 2026-03-08T23:11:35.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:11:35.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776130 -lt 292057776130 2026-03-08T23:11:35.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:11:35.278 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:35.278 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:35.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:11:35.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:11:35.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:11:35.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: 
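[editor's note] The `flush_pg_stats` trace above follows a two-phase pattern: `ceph tell osd.N flush_pg_stats` returns a sequence number for each OSD, then the helper polls `ceph osd last-stat-seq N` until the monitor has caught up to that sequence (visible where osd.0 reports 249108103174 < 249108103175 and sleeps). A Python sketch of the pattern; `tell_flush` and `last_stat_seq` are hypothetical callables standing in for the two ceph CLI invocations:

```python
import time

def flush_pg_stats(tell_flush, last_stat_seq, osd_ids,
                   timeout=300, poll_interval=1.0):
    """Ask every OSD to flush its PG stats, then block until the
    monitor's last-stat-seq for each OSD reaches the returned sequence.

    tell_flush(osd)    stands in for: ceph tell osd.N flush_pg_stats
    last_stat_seq(osd) stands in for: ceph osd last-stat-seq N
    """
    # Phase 1: trigger the flush on every OSD, recording each sequence.
    seqs = {osd: tell_flush(osd) for osd in osd_ids}
    # Phase 2: wait for the monitor to observe each sequence.
    for osd, want in seqs.items():
        remaining = timeout
        while last_stat_seq(osd) < want:
            if remaining == 0:
                raise TimeoutError(f"osd.{osd} never reached seq {want}")
            time.sleep(poll_interval)
            remaining -= 1
```

Flushing first and waiting second lets all OSDs make progress concurrently instead of serializing flush-then-wait per OSD.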
get_num_active_clean: local expression 2026-03-08T23:11:35.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:11:35.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:11:35.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:11:35.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:11:35.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:11:35.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:11:35.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:35.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:35.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:11:35.858 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3647: corrupt_scrub_erasure: rm td/osd-scrub-repair/bad-val td/osd-scrub-repair/newval 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ5 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ5 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ5 2026-03-08T23:11:35.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:11:35.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:11:35.859 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:11:36.064 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:11:36.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:11:36.271 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:11:36.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:11:36.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:11:36.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ5 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:11:36.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 5 % 2 2026-03-08T23:11:36.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=1 2026-03-08T23:11:36.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:11:36.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3652: corrupt_scrub_erasure: dd if=/dev/urandom of=td/osd-scrub-repair/CORRUPT bs=2048 count=2 2026-03-08T23:11:36.316 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records in 2026-03-08T23:11:36.316 
INFO:tasks.workunit.client.0.vm03.stderr:2+0 records out 2026-03-08T23:11:36.316 INFO:tasks.workunit.client.0.vm03.stderr:4096 bytes (4.1 kB, 4.0 KiB) copied, 6.0795e-05 s, 67.4 MB/s 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3653: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 1 EOBJ5 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 EOBJ5 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:11:36.317 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:11:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:11:36.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:11:36.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 EOBJ5 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:11:36.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:11:36.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:11:36.424 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:11:36.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:11:36.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:11:36.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 EOBJ5 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:11:37.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:11:37.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:11:37.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 
'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 
2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:11:37.623 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:11:37.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:11:37.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:11:37.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:11:37.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:11:37.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:11:37.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:11:37.624 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:11:37.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' 
'--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:11:37.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:11:37.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:11:37.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:11:37.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:11:37.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:11:37.642 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:37.640+0000 7fda099418c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:37.642 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:37.640+0000 7fda099418c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:37.644 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:37.644+0000 7fda099418c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:37.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:11:37.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:38.596 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:38.596+0000 7fda099418c0 -1 Falling back to public interface 2026-03-08T23:11:38.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:11:38.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:38.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:11:38.990 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:38.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:38.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:11:39.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:39.561 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:39.560+0000 7fda099418c0 -1 osd.1 71 log_to_monitors true 2026-03-08T23:11:40.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:40.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:40.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:11:40.159 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:11:40.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:40.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:11:40.335 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:40.479 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:40.476+0000 7fda008f1640 -1 osd.1 71 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:11:41.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:41.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:41.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:11:41.337 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:11:41.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:41.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:11:41.501 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 75 up_thru 75 down_at 72 last_clean_interval [51,71) [v2:127.0.0.1:6810/3734083974,v1:127.0.0.1:6811/3734083974] [v2:127.0.0.1:6812/3734083974,v1:127.0.0.1:6813/3734083974] exists,up 55610426-1f0a-4abd-b96b-732daaa923cb 2026-03-08T23:11:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:11:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:11:41.501 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:11:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:11:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:11:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:11:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:11:41.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:11:41.502 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:11:41.502 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:11:41.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:11:41.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:11:41.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:11:41.565 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:11:41.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:11:41.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:11:41.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:11:41.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:11:41.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:11:41.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:11:41.727 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:41.727 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:11:41.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:11:41.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:41.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:11:41.812 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=249108103178 2026-03-08T23:11:41.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 249108103178 2026-03-08T23:11:41.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103178' 2026-03-08T23:11:41.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:41.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:11:41.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=322122547202 2026-03-08T23:11:41.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 322122547202 2026-03-08T23:11:41.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103178 1-322122547202' 2026-03-08T23:11:41.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:41.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:11:41.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776133 2026-03-08T23:11:41.983 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776133 2026-03-08T23:11:41.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-249108103178 1-322122547202 2-292057776133' 2026-03-08T23:11:41.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:41.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-249108103178 2026-03-08T23:11:41.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:41.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:11:41.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-249108103178 2026-03-08T23:11:41.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:41.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=249108103178 2026-03-08T23:11:41.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 249108103178' 2026-03-08T23:11:41.986 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 249108103178 2026-03-08T23:11:41.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:11:42.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103177 -lt 249108103178 2026-03-08T23:11:42.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:43.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:11:43.150 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:43.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103177 -lt 249108103178 2026-03-08T23:11:43.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:44.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:11:44.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:44.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 249108103178 -lt 249108103178 2026-03-08T23:11:44.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:44.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-322122547202 2026-03-08T23:11:44.500 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:44.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:11:44.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:44.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-322122547202 2026-03-08T23:11:44.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=322122547202 2026-03-08T23:11:44.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 322122547202' 2026-03-08T23:11:44.503 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 322122547202 2026-03-08T23:11:44.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:11:44.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 322122547202 -lt 322122547202 2026-03-08T23:11:44.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:44.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776133 2026-03-08T23:11:44.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T23:11:44.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:11:44.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776133 2026-03-08T23:11:44.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:44.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776133 2026-03-08T23:11:44.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776133' 2026-03-08T23:11:44.679 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776133 2026-03-08T23:11:44.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:11:44.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776133 -lt 292057776133 2026-03-08T23:11:44.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:11:44.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:44.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:45.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
wait_for_clean: test 6 == 0 2026-03-08T23:11:45.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:11:45.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:11:45.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:11:45.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:11:45.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:11:45.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:11:45.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:11:45.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:11:45.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:11:45.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:45.220 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:45.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:11:45.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:11:45.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:11:45.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:11:45.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ6 2026-03-08T23:11:45.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ6 2026-03-08T23:11:45.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:11:45.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:11:45.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ6 2026-03-08T23:11:45.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:11:45.418 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:11:45.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:11:45.636 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:11:45.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:11:45.843 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:11:45.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:11:45.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:11:45.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ6 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:11:45.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 6 % 2 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=0 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3657: corrupt_scrub_erasure: 
objectstore_tool td/osd-scrub-repair 0 EOBJ6 rm-attr hinfo_key 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ6 rm-attr hinfo_key 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:11:45.884 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:11:45.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:11:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:11:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:11:45.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:11:45.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:11:45.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ6 rm-attr hinfo_key 2026-03-08T23:11:45.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:11:45.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:11:45.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:11:45.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:11:45.991 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:11:45.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ6 rm-attr hinfo_key 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:11:47.431 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:11:47.431 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:11:47.432 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:11:47.432 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:11:47.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:47.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:47.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:11:47.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:11:47.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:11:47.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:11:47.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:11:47.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:11:47.433 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:11:47.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:11:47.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:11:47.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:11:47.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:11:47.434 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:11:47.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:11:47.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:11:47.435 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:11:47.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:11:47.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:11:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:11:47.451 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:47.448+0000 7f5051efb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:47.452 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:47.452+0000 7f5051efb8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:47.453 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:47.452+0000 7f5051efb8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:47.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:11:47.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:48.656 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:48.656+0000 7f5051efb8c0 -1 Falling back to public interface 2026-03-08T23:11:48.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:11:48.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:48.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:11:48.794 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:48.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:48.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:11:48.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:49.889 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:49.888+0000 7f5051efb8c0 -1 osd.0 78 log_to_monitors true 2026-03-08T23:11:49.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:49.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:49.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:11:49.962 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:11:49.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:49.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:11:50.152 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:51.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:51.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:51.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:11:51.153 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:11:51.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:51.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:11:51.320 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 82 up_thru 82 down_at 79 last_clean_interval [58,78) [v2:127.0.0.1:6802/2854131564,v1:127.0.0.1:6803/2854131564] [v2:127.0.0.1:6804/2854131564,v1:127.0.0.1:6805/2854131564] exists,up e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:11:51.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:11:51.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:11:51.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:11:51.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:11:51.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:11:51.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:11:51.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:11:51.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:11:51.550 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:51.550 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:11:51.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:11:51.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:51.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:11:51.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318274 2026-03-08T23:11:51.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318274 2026-03-08T23:11:51.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-352187318274' 2026-03-08T23:11:51.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:51.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:11:51.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=322122547205 2026-03-08T23:11:51.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 322122547205 2026-03-08T23:11:51.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318274 1-322122547205' 2026-03-08T23:11:51.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:11:51.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:11:51.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776136 2026-03-08T23:11:51.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776136 2026-03-08T23:11:51.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318274 1-322122547205 2-292057776136' 2026-03-08T23:11:51.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:11:51.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-352187318274 2026-03-08T23:11:51.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:51.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:11:51.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-352187318274 2026-03-08T23:11:51.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:51.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318274 2026-03-08T23:11:51.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 352187318274' 2026-03-08T23:11:51.816 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 352187318274 2026-03-08T23:11:51.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:11:51.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318274 -lt 352187318274 2026-03-08T23:11:51.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:51.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
echo 1-322122547205 2026-03-08T23:11:51.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:51.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:11:51.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-322122547205 2026-03-08T23:11:51.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:51.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=322122547205 2026-03-08T23:11:51.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 322122547205' 2026-03-08T23:11:51.993 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 322122547205 2026-03-08T23:11:51.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:11:52.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 322122547205 -lt 322122547205 2026-03-08T23:11:52.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:11:52.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776136 2026-03-08T23:11:52.158 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:11:52.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:11:52.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776136 2026-03-08T23:11:52.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:11:52.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776136 2026-03-08T23:11:52.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776136' 2026-03-08T23:11:52.160 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776136 2026-03-08T23:11:52.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:11:52.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776135 -lt 292057776136 2026-03-08T23:11:52.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:53.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:11:53.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 
2026-03-08T23:11:53.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776135 -lt 292057776136 2026-03-08T23:11:53.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:11:54.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:11:54.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:11:54.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776136 -lt 292057776136 2026-03-08T23:11:54.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:11:54.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:54.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:54.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:11:54.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:11:54.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:11:54.947 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:11:54.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:11:54.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:11:54.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:11:54.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:11:55.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:11:55.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:11:55.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:11:55.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:11:55.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:11:55.314 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3658: corrupt_scrub_erasure: echo -n bad-val 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3659: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 1 EOBJ6 set-attr hinfo_key td/osd-scrub-repair/bad-val 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 EOBJ6 set-attr hinfo_key td/osd-scrub-repair/bad-val 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:11:55.315 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:11:55.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:11:55.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:11:55.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 EOBJ6 set-attr hinfo_key td/osd-scrub-repair/bad-val 2026-03-08T23:11:55.421 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:11:55.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:11:55.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:11:55.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:11:55.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:11:55.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 EOBJ6 set-attr hinfo_key td/osd-scrub-repair/bad-val 2026-03-08T23:11:56.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: 
activate_osd: shift 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' 
--run-dir=td/osd-scrub-repair' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: 
activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:11:56.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:11:56.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:11:56.908 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:11:56.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:11:56.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:11:56.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:11:56.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:11:56.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:11:56.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:11:56.925 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:56.924+0000 7f06671918c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:56.942 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:56.924+0000 7f06671918c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:11:56.942 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:56.924+0000 7f06671918c0 -1 WARNING: all dangerous and experimental features are enabled. 
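The `corrupt_scrub_erasure` step traced here follows a fixed pattern: `ceph-objectstore-tool` can only open an OSD's store while the daemon is down, so the helpers kill the OSD, run the tool (`set-attr hinfo_key` against EOBJ6), restart the OSD, and wait for the PGs to settle. A dry-run sketch of that sequence, where `corrupt_attr` is a hypothetical wrapper that only echoes the steps so no live cluster is needed:

```shell
#!/bin/sh
# Dry-run sketch of the stop-modify-restart sequence performed by
# objectstore_tool / _objectstore_tool_nowait in the trace above.
# `corrupt_attr` is a hypothetical name; it echoes the commands the real
# helpers run instead of executing them.
corrupt_attr() {
    dir=$1 id=$2 obj=$3 key=$4 file=$5
    echo "kill_daemons $dir TERM osd.$id"
    echo "ceph-objectstore-tool --data-path $dir/$id $obj set-attr $key $file"
    echo "activate_osd $dir $id"
    echo "wait_for_clean"
}

corrupt_attr td/osd-scrub-repair 1 EOBJ6 hinfo_key td/osd-scrub-repair/bad-val
```

The ordering matters: running the tool against a live OSD's data path would race with the daemon, which is why the kill comes first and the wait comes last.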
2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:57.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:11:57.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:58.124 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:58.124+0000 7f06671918c0 -1 Falling back to public interface 2026-03-08T23:11:58.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:11:58.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:58.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:11:58.256 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:11:58.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:58.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:11:58.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:11:59.091 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:11:59.088+0000 7f06671918c0 -1 osd.1 83 log_to_monitors true 2026-03-08T23:11:59.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:11:59.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:11:59.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:11:59.431 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:11:59.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:11:59.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:11:59.615 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:00.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:00.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:00.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:12:00.616 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:12:00.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:00.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 87 up_thru 87 down_at 84 last_clean_interval [75,83) [v2:127.0.0.1:6810/3995992482,v1:127.0.0.1:6811/3995992482] [v2:127.0.0.1:6812/3995992482,v1:127.0.0.1:6813/3995992482] exists,up 55610426-1f0a-4abd-b96b-732daaa923cb 2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
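`wait_for_osd`, just traced to completion, succeeds once `ceph osd dump | grep "osd.$id up"` matches, retrying once per second up to 300 times. The predicate itself can be shown without a cluster by passing the dump text in directly (the sample line below is abbreviated from the trace):

```shell
#!/bin/sh
# Sketch of wait_for_osd's success test: grep the osd dump output for
# "osd.$id up".  The real helper pipes `ceph osd dump` in and retries up
# to 300 times; here the dump text is a function argument.
osd_is_up() {
    id=$1
    shift
    echo "$*" | grep -q "osd\.$id up"
}

dump="osd.1 up in weight 1 up_from 87 up_thru 87 down_at 84"
osd_is_up 1 "$dump" && echo "osd.1 is up"   # -> osd.1 is up
```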
2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:12:00.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:12:00.788 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:12:00.788 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:12:00.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:12:00.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:12:00.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:12:00.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:12:00.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
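The `delays` array logged above for `get_timeout_delays 90 .1` follows a recognizable schedule: start at the given delay, double each step, cap individual sleeps, and emit one final partial delay so the total equals the timeout. A sketch that reproduces the logged array under those assumptions (the cap of 15 and the fill-to-total remainder are inferred from the logged values, not taken from the helper's source):

```shell
#!/bin/sh
# Sketch of the backoff schedule get_timeout_delays appears to produce:
# doubling delays, capped per-step, summing exactly to the timeout.
# The per-step cap (15) is inferred from the trace above.
timeout_delays() {
    awk -v total="$1" -v d="$2" 'BEGIN {
        sum = 0
        while (sum + d < total) {
            printf "%g ", d
            sum += d
            d *= 2
            if (d > 15) d = 15     # cap inferred from the logged array
        }
        if (total - sum > 0) printf "%g", total - sum
        printf "\n"
    }'
}

timeout_delays 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

The early doubling keeps a fast cluster from waiting long, while the cap bounds each individual sleep once the budget grows.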
2026-03-08T23:12:00.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:12:00.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:12:00.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:12:00.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:12:01.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:12:01.030 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:12:01.030 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:12:01.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:12:01.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:01.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:12:01.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318277 2026-03-08T23:12:01.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318277 2026-03-08T23:12:01.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-352187318277' 2026-03-08T23:12:01.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:01.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:12:01.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154754 2026-03-08T23:12:01.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154754 2026-03-08T23:12:01.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318277 1-373662154754' 2026-03-08T23:12:01.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:01.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:12:01.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776139 2026-03-08T23:12:01.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776139 2026-03-08T23:12:01.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318277 1-373662154754 2-292057776139' 2026-03-08T23:12:01.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
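The `seqs` variable built up in the trace encodes one `"osd-seq"` token per OSD: each `ceph tell osd.N flush_pg_stats` returns a sequence number, and the helper later splits each token back apart with `cut`, as the following lines show. A sketch of just that bookkeeping, with the sequence values copied from the log:

```shell
#!/bin/sh
# Sketch of flush_pg_stats' token bookkeeping: "N-seq" pairs are packed
# into a single string, then split with cut on "-".  Seq values are the
# ones visible in the trace above.
seqs="0-352187318277 1-373662154754 2-292057776139"

for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)
    seq=$(echo "$s" | cut -d - -f 2)
    echo "waiting osd.$osd seq $seq"
done
# -> waiting osd.0 seq 352187318277
#    waiting osd.1 seq 373662154754
#    waiting osd.2 seq 292057776139
```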
2026-03-08T23:12:01.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-352187318277 2026-03-08T23:12:01.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:01.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:12:01.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:01.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-352187318277 2026-03-08T23:12:01.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318277 2026-03-08T23:12:01.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 352187318277' 2026-03-08T23:12:01.276 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 352187318277 2026-03-08T23:12:01.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:01.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318275 -lt 352187318277 2026-03-08T23:12:01.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:12:02.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 
-eq 0 ']' 2026-03-08T23:12:02.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:02.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318277 -lt 352187318277 2026-03-08T23:12:02.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:02.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154754 2026-03-08T23:12:02.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:02.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:12:02.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154754 2026-03-08T23:12:02.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:02.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154754 2026-03-08T23:12:02.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154754' 2026-03-08T23:12:02.628 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 373662154754 2026-03-08T23:12:02.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:12:02.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154754 -lt 373662154754 2026-03-08T23:12:02.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:02.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776139 2026-03-08T23:12:02.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:02.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:12:02.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776139 2026-03-08T23:12:02.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:02.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776139 2026-03-08T23:12:02.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776139' 2026-03-08T23:12:02.816 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776139 2026-03-08T23:12:02.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:12:02.991 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776139 -lt 292057776139 2026-03-08T23:12:02.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:12:02.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:12:02.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:12:03.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:12:03.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:12:03.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:12:03.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:12:03.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:12:03.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:12:03.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:12:03.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:12:03.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:12:03.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:12:03.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:12:03.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3612: corrupt_scrub_erasure: for i in $(seq 1 $total_objs) 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3613: corrupt_scrub_erasure: objname=EOBJ7 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3614: 
corrupt_scrub_erasure: add_something td/osd-scrub-repair ecpool EOBJ7 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=EOBJ7 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:12:03.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:12:03.879 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:12:03.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:12:04.091 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:12:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:12:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:12:04.107 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put EOBJ7 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:12:04.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: expr 7 % 2 2026-03-08T23:12:04.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3616: corrupt_scrub_erasure: local osd=1 2026-03-08T23:12:04.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3618: corrupt_scrub_erasure: case $i in 2026-03-08T23:12:04.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3663: corrupt_scrub_erasure: local payload=MAKETHISDIFFERENTFROMOTHEROBJECTS 2026-03-08T23:12:04.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3664: corrupt_scrub_erasure: echo MAKETHISDIFFERENTFROMOTHEROBJECTS 2026-03-08T23:12:04.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3665: corrupt_scrub_erasure: rados --pool ecpool put EOBJ7 td/osd-scrub-repair/DIFFERENT 2026-03-08T23:12:04.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3668: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ1 get-attr hinfo_key 2026-03-08T23:12:04.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:12:04.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 
2026-03-08T23:12:04.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:12:04.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ1 get-attr hinfo_key 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:12:04.194 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:12:04.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:12:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:12:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ1 get-attr hinfo_key 2026-03-08T23:12:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:12:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:12:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:12:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:12:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:12:04.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ1 get-attr hinfo_key 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: 
_objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:12:05.167 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:12:05.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:12:05.168 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:12:05.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:12:05.168 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:12:05.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:12:05.169 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:12:05.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:12:05.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:12:05.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:12:05.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:12:05.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:12:05.175 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:12:05.185 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:05.184+0000 7f54636258c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:05.195 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:05.196+0000 7f54636258c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:05.197 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:05.196+0000 7f54636258c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:12:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:12:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:12:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:12:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:12:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:12:05.350 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:12:05.351 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:05.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:05.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:05.656 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:05.656+0000 7f54636258c0 -1 Falling back to public interface 2026-03-08T23:12:06.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:06.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:06.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:12:06.519 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:12:06.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:06.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:06.675 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:06.676+0000 7f54636258c0 -1 osd.0 90 log_to_monitors true 2026-03-08T23:12:06.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:07.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:07.698 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:07.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:12:07.698 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:12:07.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:07.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:07.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:08.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:08.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:08.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:12:08.884 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:12:08.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:08.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:09.049 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 94 up_thru 94 down_at 91 last_clean_interval [82,90) [v2:127.0.0.1:6802/896469679,v1:127.0.0.1:6803/896469679] 
[v2:127.0.0.1:6804/896469679,v1:127.0.0.1:6805/896469679] exists,up e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:12:09.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 
2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:12:09.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:12:09.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:12:09.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:12:09.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:12:09.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:12:09.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:12:09.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:12:09.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:12:09.303 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:12:09.303 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:12:09.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:12:09.303 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:09.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:12:09.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=403726925826 2026-03-08T23:12:09.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 403726925826 2026-03-08T23:12:09.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-403726925826' 2026-03-08T23:12:09.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:09.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:12:09.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154757 2026-03-08T23:12:09.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154757 2026-03-08T23:12:09.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-403726925826 1-373662154757' 2026-03-08T23:12:09.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:09.578 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:12:09.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776141 2026-03-08T23:12:09.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776141 2026-03-08T23:12:09.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-403726925826 1-373662154757 2-292057776141' 2026-03-08T23:12:09.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:09.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-403726925826 2026-03-08T23:12:09.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:09.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:12:09.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-403726925826 2026-03-08T23:12:09.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:09.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=403726925826 2026-03-08T23:12:09.657 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 403726925826' 2026-03-08T23:12:09.657 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 403726925826 2026-03-08T23:12:09.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:09.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 403726925826 2026-03-08T23:12:09.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:12:10.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:12:10.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:10.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 403726925826 -lt 403726925826 2026-03-08T23:12:10.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:10.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154757 2026-03-08T23:12:10.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:10.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 
2026-03-08T23:12:10.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154757 2026-03-08T23:12:10.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:10.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154757 2026-03-08T23:12:10.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154757' 2026-03-08T23:12:10.994 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 373662154757 2026-03-08T23:12:10.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:12:11.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154757 -lt 373662154757 2026-03-08T23:12:11.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:11.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776141 2026-03-08T23:12:11.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:11.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:12:11.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 2-292057776141 2026-03-08T23:12:11.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:11.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776141 2026-03-08T23:12:11.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776141' 2026-03-08T23:12:11.153 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776141 2026-03-08T23:12:11.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:12:11.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776141 -lt 292057776141 2026-03-08T23:12:11.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:12:11.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:12:11.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:12:11.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:12:11.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:12:11.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: 
wait_for_clean: get_num_active_clean 2026-03-08T23:12:11.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:12:11.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:12:11.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:12:11.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:12:11.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:12:11.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:12:11.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:12:11.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:12:11.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:12:11.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 
2026-03-08T23:12:11.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:12:11.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:12:11.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3669: corrupt_scrub_erasure: objectstore_tool td/osd-scrub-repair 0 EOBJ7 set-attr hinfo_key td/osd-scrub-repair/hinfo 2026-03-08T23:12:11.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:12:11.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 EOBJ7 set-attr hinfo_key td/osd-scrub-repair/hinfo 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:12:11.917 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:12:11.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:12:12.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:12:12.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 EOBJ7 set-attr hinfo_key td/osd-scrub-repair/hinfo 2026-03-08T23:12:12.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:12:12.227 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:12:12.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:12:12.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:12:12.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:12:12.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 EOBJ7 set-attr hinfo_key td/osd-scrub-repair/hinfo 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local 
osd_data=td/osd-scrub-repair/0 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: 
get_asok_path 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:12:13.423 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:12:13.424 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:12:13.424 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:12:13.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:12:13.425 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:12:13.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:12:13.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:12:13.426 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:12:13.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 
--osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:12:13.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:12:13.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:12:13.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:12:13.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:12:13.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:12:13.446 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:13.444+0000 7f58389268c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:13.446 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:13.444+0000 7f58389268c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:13.448 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:13.448+0000 7f58389268c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:13.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:13.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:14.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:14.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i < 300 )) 2026-03-08T23:12:14.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:12:14.780 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:12:14.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:14.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:14.900 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:14.900+0000 7f58389268c0 -1 Falling back to public interface 2026-03-08T23:12:14.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:15.884 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:15.884+0000 7f58389268c0 -1 osd.0 95 log_to_monitors true 2026-03-08T23:12:15.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:15.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:15.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:12:15.949 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:12:15.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:15.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:16.126 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:16.792 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:16.792+0000 7f582f8d6640 -1 osd.0 95 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:12:17.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:17.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:17.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:12:17.128 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:12:17.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:17.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 99 up_thru 99 down_at 96 last_clean_interval [94,95) [v2:127.0.0.1:6802/791527409,v1:127.0.0.1:6803/791527409] [v2:127.0.0.1:6804/791527409,v1:127.0.0.1:6805/791527409] exists,up e680aa22-886e-4c4b-92d7-00ecd80dc50c 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: 
wait_for_osd: return 0 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:12:17.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:12:17.293 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:12:17.293 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:12:17.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:12:17.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:12:17.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:12:17.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' 
'4.5') 2026-03-08T23:12:17.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:12:17.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:12:17.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:12:17.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:12:17.357 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:12:17.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:12:17.516 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:12:17.516 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:12:17.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:12:17.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:17.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:12:17.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=425201762306 2026-03-08T23:12:17.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
425201762306 2026-03-08T23:12:17.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762306' 2026-03-08T23:12:17.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:17.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:12:17.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154759 2026-03-08T23:12:17.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154759 2026-03-08T23:12:17.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762306 1-373662154759' 2026-03-08T23:12:17.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:17.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:12:17.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776144 2026-03-08T23:12:17.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776144 2026-03-08T23:12:17.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762306 1-373662154759 2-292057776144' 
2026-03-08T23:12:17.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:17.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-425201762306 2026-03-08T23:12:17.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:17.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:12:17.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-425201762306 2026-03-08T23:12:17.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:17.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=425201762306 2026-03-08T23:12:17.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 425201762306' 2026-03-08T23:12:17.760 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 425201762306 2026-03-08T23:12:17.760 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:18.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762306 -lt 425201762306 2026-03-08T23:12:18.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T23:12:18.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154759 2026-03-08T23:12:18.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:18.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:12:18.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154759 2026-03-08T23:12:18.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:18.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154759 2026-03-08T23:12:18.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154759' 2026-03-08T23:12:18.008 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 373662154759 2026-03-08T23:12:18.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:12:18.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154759 -lt 373662154759 2026-03-08T23:12:18.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:18.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-292057776144 2026-03-08T23:12:18.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:18.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:12:18.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776144 2026-03-08T23:12:18.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:18.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776144 2026-03-08T23:12:18.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776144' 2026-03-08T23:12:18.174 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 292057776144 2026-03-08T23:12:18.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:12:18.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776144 -lt 292057776144 2026-03-08T23:12:18.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:12:18.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:12:18.353 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:12:18.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:12:18.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:12:18.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:12:18.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:12:18.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:12:18.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:12:18.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:12:18.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:12:18.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:12:18.723 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:12:18.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:12:18.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:12:18.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:12:18.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:12:18.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:12:18.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3670: corrupt_scrub_erasure: rm -f td/osd-scrub-repair/hinfo 2026-03-08T23:12:18.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3676: corrupt_scrub_erasure: get_pg ecpool EOBJ0 2026-03-08T23:12:18.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:12:18.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=EOBJ0 2026-03-08T23:12:18.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool EOBJ0 2026-03-08T23:12:18.918 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:12:19.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3676: corrupt_scrub_erasure: local pg=3.0 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3678: corrupt_scrub_erasure: pg_scrub 3.0 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=3.0 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 3.0 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=3.0 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:12:19.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:12:19.084 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:12:19.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:12:19.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:12:19.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:12:19.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:12:19.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:12:19.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:12:19.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:12:19.430 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:12:19.430 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:12:19.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:12:19.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:19.430 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:12:19.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=425201762307 2026-03-08T23:12:19.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 425201762307 2026-03-08T23:12:19.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762307' 2026-03-08T23:12:19.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:19.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:12:19.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154761 2026-03-08T23:12:19.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154761 2026-03-08T23:12:19.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762307 1-373662154761' 2026-03-08T23:12:19.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:19.582 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:12:19.658 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776145 2026-03-08T23:12:19.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776145 2026-03-08T23:12:19.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762307 1-373662154761 2-292057776145' 2026-03-08T23:12:19.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:19.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-425201762307 2026-03-08T23:12:19.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:19.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:12:19.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-425201762307 2026-03-08T23:12:19.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:19.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=425201762307 2026-03-08T23:12:19.661 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 425201762307 2026-03-08T23:12:19.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 
425201762307' 2026-03-08T23:12:19.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:19.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762306 -lt 425201762307 2026-03-08T23:12:19.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:12:20.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:12:20.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:20.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762307 -lt 425201762307 2026-03-08T23:12:20.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:20.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154761 2026-03-08T23:12:20.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:20.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:12:20.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154761 2026-03-08T23:12:20.990 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:20.991 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 373662154761 2026-03-08T23:12:20.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154761 2026-03-08T23:12:20.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154761' 2026-03-08T23:12:20.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:12:21.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154761 -lt 373662154761 2026-03-08T23:12:21.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:21.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776145 2026-03-08T23:12:21.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:21.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:12:21.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776145 2026-03-08T23:12:21.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
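The repeated `ceph osd last-stat-seq N` / `test ... -lt ...` pairs above are the wait phase of `flush_pg_stats`: for each recorded pair it polls the monitor until the OSD's last reported stat sequence catches up to the flushed sequence, sleeping one second per attempt (note the `sleep 1` for osd.0, whose first poll still returned the pre-flush value 425201762306). A hedged sketch of that loop, reconstructed from the trace — the query is stubbed so it catches up after one poll, and the 300-second countdown mirrors the `'[' 300 -eq 0 ']'` check in the log:

```shell
#!/usr/bin/env bash
# Stub for `ceph osd last-stat-seq $osd` (values hypothetical, copied from
# the trace): returns the stale sequence once, then the caught-up one.
# It sets $cur in the current shell rather than echoing, so the call
# counter survives (a $(...) substitution would run in a subshell).
calls=0
last_stat_seq() {
    calls=$((calls + 1))
    if [ "$calls" -lt 2 ]; then
        cur=425201762306          # pre-flush value, still behind
    else
        cur=425201762307          # OSD has reported; caught up
    fi
}

osd=0
seq=425201762307                  # target recorded by the flush step
timeout=300
echo "waiting osd.$osd seq $seq"
while last_stat_seq "$osd"; test "$cur" -lt "$seq"; do
    sleep 1                       # poll once per second, as in the trace
    timeout=$((timeout - 1))
    [ "$timeout" -eq 0 ] && break # real helper fails the test here
done
```

The same stamp-polling shape reappears later in the trace as `wait_for_scrub`, which compares `last_scrub_stamp` strings with `test ... '>' ...` instead of numeric sequences.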
2026-03-08T23:12:21.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776145 2026-03-08T23:12:21.155 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 292057776145 2026-03-08T23:12:21.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776145' 2026-03-08T23:12:21.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:12:21.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776145 -lt 292057776145 2026-03-08T23:12:21.322 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 3.0 loop 0 2026-03-08T23:12:21.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:12:21.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 3.0 loop 0' 2026-03-08T23:12:21.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 3.0 2026-03-08T23:12:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=3.0 2026-03-08T23:12:21.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:12:21.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 3.0 query 2026-03-08T23:12:21.323 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 3.0 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:21.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:12:21.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local 
last_scrub=2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:21.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 3.0 2026-03-08T23:12:21.712 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0s0 on osd.1 to scrub 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 3.0 2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:21.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:12:21.723 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:21.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:12:21.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:21.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:12:22.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:12:22.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:12:22.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:12:22.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:22.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:12:22.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:22.892 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:12:23.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:23.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:12:24.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:12:24.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:12:24.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:12:24.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:24.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:12:24.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:24.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:12:24.226 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:24.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:12:25.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:12:25.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:12:25.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:12:25.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:25.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:12:25.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:25.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:12:25.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:25.394 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:12:26.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:12:26.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:12:26.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:12:26.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:26.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:12:26.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:26.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:12:26.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:12:22.113620+0000 '>' 2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:26.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:12:26.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3680: 
corrupt_scrub_erasure: rados list-inconsistent-pg ecpool 2026-03-08T23:12:26.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3682: corrupt_scrub_erasure: jq '. | length' td/osd-scrub-repair/json 2026-03-08T23:12:26.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3682: corrupt_scrub_erasure: test 1 = 1 2026-03-08T23:12:26.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3684: corrupt_scrub_erasure: jq -r '.[0]' td/osd-scrub-repair/json 2026-03-08T23:12:26.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3684: corrupt_scrub_erasure: test 3.0 = 3.0 2026-03-08T23:12:26.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3686: corrupt_scrub_erasure: rados list-inconsistent-obj 3.0 2026-03-08T23:12:26.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3688: corrupt_scrub_erasure: jq .epoch td/osd-scrub-repair/json 2026-03-08T23:12:26.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3688: corrupt_scrub_erasure: epoch=99 2026-03-08T23:12:26.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3690: corrupt_scrub_erasure: jq 'def walk(f): 2026-03-08T23:12:26.643 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:12:26.643 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:12:26.643 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:12:26.643 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . 
+ { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:12:26.643 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:12:26.643 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:12:26.644 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:12:26.644 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:12:26.644 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:12:26.644 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:12:26.644 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:12:26.644 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' 2026-03-08T23:12:26.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3690: corrupt_scrub_erasure: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:12:26.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3690: corrupt_scrub_erasure: jq .inconsistents 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4310: corrupt_scrub_erasure: jq 'def walk(f): 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . 
+ { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' td/osd-scrub-repair/json 2026-03-08T23:12:26.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4310: corrupt_scrub_erasure: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:12:26.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4310: corrupt_scrub_erasure: jq .inconsistents 2026-03-08T23:12:26.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4311: corrupt_scrub_erasure: multidiff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:12:26.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:12:26.692 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4312: corrupt_scrub_erasure: test no = yes 2026-03-08T23:12:26.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4317: corrupt_scrub_erasure: test '' = yes 2026-03-08T23:12:26.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4322: corrupt_scrub_erasure: pg_deep_scrub 3.0 2026-03-08T23:12:26.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1941: pg_deep_scrub: local pgid=3.0 2026-03-08T23:12:26.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1943: pg_deep_scrub: wait_for_pg_clean 3.0 2026-03-08T23:12:26.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=3.0 2026-03-08T23:12:26.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:12:26.693 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:12:26.693 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:12:26.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:12:26.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:12:26.693 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:12:26.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:12:26.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:12:26.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:12:26.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:12:26.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:12:26.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:12:27.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:12:27.038 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:12:27.039 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:12:27.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:12:27.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:27.039 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:12:27.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=425201762310 2026-03-08T23:12:27.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 425201762310 2026-03-08T23:12:27.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762310' 2026-03-08T23:12:27.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:27.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:12:27.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=373662154763 2026-03-08T23:12:27.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 373662154763 2026-03-08T23:12:27.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762310 1-373662154763' 2026-03-08T23:12:27.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:27.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:12:27.278 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=292057776148 2026-03-08T23:12:27.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 292057776148 2026-03-08T23:12:27.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-425201762310 1-373662154763 2-292057776148' 2026-03-08T23:12:27.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:27.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-425201762310 2026-03-08T23:12:27.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:27.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:12:27.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-425201762310 2026-03-08T23:12:27.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:27.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=425201762310 2026-03-08T23:12:27.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 425201762310' 2026-03-08T23:12:27.282 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 
425201762310 2026-03-08T23:12:27.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:27.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762308 -lt 425201762310 2026-03-08T23:12:27.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:12:28.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:12:28.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:28.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762310 -lt 425201762310 2026-03-08T23:12:28.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:28.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-373662154763 2026-03-08T23:12:28.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:28.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:12:28.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-373662154763 2026-03-08T23:12:28.641 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:28.641 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 373662154763 2026-03-08T23:12:28.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=373662154763 2026-03-08T23:12:28.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 373662154763' 2026-03-08T23:12:28.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:12:28.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 373662154763 -lt 373662154763 2026-03-08T23:12:28.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:28.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-292057776148 2026-03-08T23:12:28.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:28.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:12:28.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-292057776148 2026-03-08T23:12:28.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
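The `flush_pg_stats` trace above follows a tell-then-poll pattern: each OSD is told to flush its PG stats, the sequence number it returns is remembered, and `ceph osd last-stat-seq` is polled until it catches up. A minimal standalone sketch of that pattern (the function name `flush_and_wait` is ours for illustration, not the helper's exact code):

```shell
# Sketch of the tell-then-poll pattern seen in flush_pg_stats:
# ask each OSD to flush its PG stats, remember the returned
# sequence number, then poll `ceph osd last-stat-seq` until the
# monitor has seen a stat report at least that new.
flush_and_wait() {
    local osd seq
    for osd in $(ceph osd ls); do
        seq=$(ceph tell osd.$osd flush_pg_stats)
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
            sleep 1
        done
    done
}
```

This matches the loop visible in the trace, where osd.0 is polled once more after reporting `425201762308 -lt 425201762310` before the sequence catches up.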
2026-03-08T23:12:28.817 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 292057776148 2026-03-08T23:12:28.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=292057776148 2026-03-08T23:12:28.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 292057776148' 2026-03-08T23:12:28.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:12:28.992 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 3.0 loop 0 2026-03-08T23:12:28.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 292057776148 -lt 292057776148 2026-03-08T23:12:28.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:12:28.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 3.0 loop 0' 2026-03-08T23:12:28.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 3.0 2026-03-08T23:12:28.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=3.0 2026-03-08T23:12:28.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:12:28.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 3.0 query 2026-03-08T23:12:28.993 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean+inconsistent 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean+inconsistent == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:29.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:12:29.246 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: local last_scrub=2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:29.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: pg_deep_scrub: ceph pg deep-scrub 3.0 2026-03-08T23:12:29.411 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0s0 on osd.1 to deep-scrub 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: pg_deep_scrub: wait_for_scrub 3.0 2026-03-08T23:10:36.690912+0000 last_deep_scrub_stamp 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_deep_scrub_stamp 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:29.426 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:29.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:12:29.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:29.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:12:30.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:12:30.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:12:30.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:12:30.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:30.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:12:30.609 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:30.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:12:30.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000 2026-03-08T23:12:30.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:12:31.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:12:31.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:12:31.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:12:31.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:12:31.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:12:31.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:12:31.789 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp'
2026-03-08T23:12:31.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000
2026-03-08T23:12:31.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:12:32.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:12:32.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:12:32.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp
2026-03-08T23:12:32.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:12:32.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp
2026-03-08T23:12:32.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:12:32.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp'
2026-03-08T23:12:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000
2026-03-08T23:12:33.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:12:34.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:12:34.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:12:34.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp
2026-03-08T23:12:34.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:12:34.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp
2026-03-08T23:12:34.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:12:34.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp'
2026-03-08T23:12:34.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000
2026-03-08T23:12:34.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:12:35.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:12:35.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:12:35.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp
2026-03-08T23:12:35.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:12:35.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp
2026-03-08T23:12:35.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:12:35.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp'
2026-03-08T23:12:35.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:10:36.690912+0000 '>' 2026-03-08T23:10:36.690912+0000
2026-03-08T23:12:35.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:12:36.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:12:36.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:12:36.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp
2026-03-08T23:12:36.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:12:36.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp
2026-03-08T23:12:36.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:12:36.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp'
2026-03-08T23:12:36.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:12:30.251180+0000 '>' 2026-03-08T23:10:36.690912+0000
2026-03-08T23:12:36.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T23:12:36.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4324: corrupt_scrub_erasure: rados list-inconsistent-pg ecpool
2026-03-08T23:12:36.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4326: corrupt_scrub_erasure: jq '. | length' td/osd-scrub-repair/json
2026-03-08T23:12:36.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4326: corrupt_scrub_erasure: test 1 = 1
2026-03-08T23:12:36.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4328: corrupt_scrub_erasure: jq -r '.[0]' td/osd-scrub-repair/json
2026-03-08T23:12:36.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4328: corrupt_scrub_erasure: test 3.0 = 3.0
2026-03-08T23:12:36.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4330: corrupt_scrub_erasure: rados list-inconsistent-obj 3.0
2026-03-08T23:12:36.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4332: corrupt_scrub_erasure: jq .epoch td/osd-scrub-repair/json
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4332: corrupt_scrub_erasure: epoch=99
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4334: corrupt_scrub_erasure: '[' true = true ']'
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4336: corrupt_scrub_erasure: jq 'def walk(f):
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr: . as $in
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr: else f
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr: end;
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end)
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end)
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end)
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end)
2026-03-08T23:12:36.733 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)'
2026-03-08T23:12:36.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4336: corrupt_scrub_erasure: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))'
2026-03-08T23:12:36.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:4336: corrupt_scrub_erasure: jq .inconsistents
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5724: corrupt_scrub_erasure: jq 'def walk(f):
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr: . as $in
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr: else f
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr: end;
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end)
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end)
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end)
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end)
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5724: corrupt_scrub_erasure: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))'
2026-03-08T23:12:36.759 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' td/osd-scrub-repair/json
2026-03-08T23:12:36.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5724: corrupt_scrub_erasure: jq .inconsistents
2026-03-08T23:12:36.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5725: corrupt_scrub_erasure: multidiff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson
2026-03-08T23:12:36.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson
2026-03-08T23:12:36.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5726: corrupt_scrub_erasure: test no = yes
2026-03-08T23:12:36.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5737: corrupt_scrub_erasure: test '' = yes
2026-03-08T23:12:36.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5742: corrupt_scrub_erasure: ceph osd pool rm ecpool ecpool --yes-i-really-really-mean-it
2026-03-08T23:12:37.003 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' removed
2026-03-08T23:12:37.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair
2026-03-08T23:12:37.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T23:12:37.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:12:37.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T23:12:37.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:12:37.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:12:37.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:12:37.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:12:37.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:12:37.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:12:37.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:12:37.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:12:37.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:12:37.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:12:37.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:12:37.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:12:37.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:12:37.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:12:37.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:12:37.148 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:12:37.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:12:37.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:12:37.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:12:37.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:12:37.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:12:37.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:12:37.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:12:37.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:12:37.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:12:37.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:12:37.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:12:37.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:12:37.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:12:37.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:12:37.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:12:37.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:12:37.186 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:12:37.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:12:37.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:12:37.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:12:37.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:12:37.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:12:37.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:12:37.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:12:37.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:12:37.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:12:37.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:12:37.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:12:37.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:12:37.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:12:37.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair
2026-03-08T23:12:37.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:12:37.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:12:37.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:12:37.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024
2026-03-08T23:12:37.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_scrub_replicated td/osd-scrub-repair
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:991: TEST_corrupt_scrub_replicated: local dir=td/osd-scrub-repair
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:992: TEST_corrupt_scrub_replicated: local poolname=csr_pool
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:993: TEST_corrupt_scrub_replicated: local total_objs=19
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:995: TEST_corrupt_scrub_replicated: run_mon td/osd-scrub-repair a --osd_pool_default_size=2
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a
2026-03-08T23:12:37.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair --osd_pool_default_size=2
2026-03-08T23:12:37.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:12:37.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:12:37.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:12:37.225 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:12:37.225 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:12:37.225 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:12:37.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:12:37.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2
2026-03-08T23:12:37.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:12:37.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:12:37.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:12:37.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:12:37.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:12:37.257 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:12:37.257 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:12:37.257 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:12:37.257 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:12:37.257 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:12:37.258 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:12:37.258 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:12:37.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:12:37.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T23:12:37.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:12:37.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:12:37.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:12:37.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:12:37.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:12:37.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:12:37.334 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:12:37.334 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:12:37.334 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:12:37.335 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:12:37.335 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:12:37.335 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:12:37.335 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:12:37.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:12:37.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T23:12:37.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:996: TEST_corrupt_scrub_replicated: run_mgr td/osd-scrub-repair x
2026-03-08T23:12:37.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T23:12:37.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:12:37.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:12:37.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:12:37.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T23:12:37.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:12:37.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:12:37.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:12:37.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:12:37.555 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:12:37.555 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:12:37.555 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:12:37.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:12:37.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:12:37.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:997: TEST_corrupt_scrub_replicated: run_osd td/osd-scrub-repair 0
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:12:37.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:12:37.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:12:37.576 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:12:37.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:12:37.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:12:37.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:12:37.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:12:37.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:12:37.577 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:12:37.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:12:37.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:12:37.580 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:12:37.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:12:37.580 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 b8b43426-8172-41a4-ad76-a875625b04e6' 2026-03-08T23:12:37.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:12:37.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBlAq5povl/IxAAdU0gyFkkMiSOO8ZcZOOqwA== 2026-03-08T23:12:37.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBlAq5povl/IxAAdU0gyFkkMiSOO8ZcZOOqwA=="}' 2026-03-08T23:12:37.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b8b43426-8172-41a4-ad76-a875625b04e6 -i td/osd-scrub-repair/0/new.json 2026-03-08T23:12:37.698 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:12:37.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:12:37.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBlAq5povl/IxAAdU0gyFkkMiSOO8ZcZOOqwA== --osd-uuid b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:12:37.734 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:37.733+0000 7f4d6442c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:37.736 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:37.737+0000 7f4d6442c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:37.737 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:37.737+0000 7f4d6442c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:37.737 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:37.737+0000 7f4d6442c8c0 -1 bdev(0x55d390a5cc00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:12:37.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:37.737+0000 7f4d6442c8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:12:39.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:12:39.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:12:39.998 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:12:39.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:12:39.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 
2026-03-08T23:12:40.108 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:12:40.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:12:40.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:12:40.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:12:40.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:12:40.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:12:40.159 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:40.153+0000 7f3b9a2ac8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:40.168 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:40.169+0000 7f3b9a2ac8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:12:40.179 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:40.177+0000 7f3b9a2ac8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:40.249 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:12:40.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:12:40.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:12:40.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:12:40.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:12:40.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:12:40.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:40.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:12:40.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:40.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:40.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:40.644 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:40.645+0000 7f3b9a2ac8c0 -1 Falling back to public 
interface 2026-03-08T23:12:41.359 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:12:41.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:41.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:41.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:12:41.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:41.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:41.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:42.113 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:42.113+0000 7f3b9a2ac8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:12:42.521 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:12:42.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:42.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:42.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:12:42.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:42.521 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:42.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:43.699 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:12:43.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:43.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:43.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:12:43.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:43.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:12:43.864 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3955357756,v1:127.0.0.1:6803/3955357756] [v2:127.0.0.1:6804/3955357756,v1:127.0.0.1:6805/3955357756] exists,up b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:12:43.865 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:998: TEST_corrupt_scrub_replicated: run_osd td/osd-scrub-repair 1 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: 
run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:12:43.865 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 
2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:12:43.866 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:12:43.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:12:43.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:12:43.868 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:12:43.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:12:43.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 f603f62e-24cf-4fca-903d-50e492fba08d' 2026-03-08T23:12:43.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:12:43.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBrAq5pt9DMNBAAh4bmgXl+dsfZFLwcr2YRhQ== 2026-03-08T23:12:43.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBrAq5pt9DMNBAAh4bmgXl+dsfZFLwcr2YRhQ=="}' 2026-03-08T23:12:43.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new f603f62e-24cf-4fca-903d-50e492fba08d -i td/osd-scrub-repair/1/new.json 2026-03-08T23:12:44.045 
INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:12:44.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:12:44.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBrAq5pt9DMNBAAh4bmgXl+dsfZFLwcr2YRhQ== --osd-uuid f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:12:44.077 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:44.077+0000 7fcf6751f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:44.078 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:44.077+0000 7fcf6751f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:44.079 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:44.081+0000 7fcf6751f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:12:44.080 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:44.081+0000 7fcf6751f8c0 -1 bdev(0x55db8dd4bc00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:12:44.080 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:44.081+0000 7fcf6751f8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:12:46.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:12:46.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:12:46.342 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:12:46.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:12:46.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:12:46.558 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:12:46.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:12:46.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:12:46.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:12:46.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:12:46.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:12:46.587 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:46.585+0000 7f03b65aa8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:46.596 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:46.597+0000 7f03b65aa8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:12:46.603 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:46.597+0000 7f03b65aa8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:12:46.748 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:46.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:12:46.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:47.927 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:12:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:47.927 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:12:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:12:48.044 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:48.045+0000 7f03b65aa8c0 -1 Falling back to public interface 2026-03-08T23:12:48.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:49.017 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:49.017+0000 7f03b65aa8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:12:49.098 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:12:49.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:49.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:49.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:12:49.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:49.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:12:49.302 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:12:50.070 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:12:50.069+0000 7f03b2564640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T23:12:50.306 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:12:50.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:12:50.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:12:50.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:12:50.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:12:50.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:12:50.482 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3601883572,v1:127.0.0.1:6811/3601883572] [v2:127.0.0.1:6812/3601883572,v1:127.0.0.1:6813/3601883572] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:12:50.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:12:50.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:12:50.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:12:50.482 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:999: TEST_corrupt_scrub_replicated: create_rbd_pool 2026-03-08T23:12:50.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:12:50.645 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T23:12:50.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:12:50.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:12:50.877 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:12:50.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:12:51.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:12:52.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1000: TEST_corrupt_scrub_replicated: wait_for_clean 2026-03-08T23:12:52.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:12:52.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:12:52.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:12:52.185 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:12:52.185 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:12:52.185 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:12:52.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:12:52.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:12:52.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:12:52.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:12:52.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:12:52.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:12:52.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:12:52.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:12:52.251 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:12:52.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:12:52.417 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:12:52.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:12:52.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:52.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:12:52.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483 2026-03-08T23:12:52.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483 2026-03-08T23:12:52.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483' 2026-03-08T23:12:52.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:52.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:12:52.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672962 2026-03-08T23:12:52.579 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672962 2026-03-08T23:12:52.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-42949672962' 2026-03-08T23:12:52.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:52.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483 2026-03-08T23:12:52.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:52.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:12:52.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483 2026-03-08T23:12:52.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:52.585 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483 2026-03-08T23:12:52.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483 2026-03-08T23:12:52.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483' 2026-03-08T23:12:52.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
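[editor's note] The sequence numbers flush_pg_stats compares (21474836483 for osd.0, 42949672962 for osd.1) look like 64-bit values split into two 32-bit halves: the high half lines up with each OSD's up_from epoch (the osd dump above shows osd.1 with up_from 10) and the low half is a small per-boot counter. That split is an observation from the values in this log, not a documented guarantee; assuming it, a tiny decoder for eyeballing the `test A -lt B` comparisons:

```python
def split_stat_seq(seq):
    """Split a `ceph osd last-stat-seq` value into (high 32 bits, low 32 bits).
    In this log the high half matches the OSD's up_from epoch."""
    return seq >> 32, seq & 0xFFFFFFFF

# 21474836483 -> (5, 3)   osd.0, third flush since it came up
# 42949672962 -> (10, 2)  osd.1, second flush since it came up at epoch 10
```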
2026-03-08T23:12:52.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:12:52.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:12:53.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:12:53.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:53.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:12:53.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:12:54.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:12:54.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:55.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836483 2026-03-08T23:12:55.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:55.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672962 2026-03-08T23:12:55.099 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:55.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:12:55.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672962 2026-03-08T23:12:55.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:55.101 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672962 2026-03-08T23:12:55.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672962 2026-03-08T23:12:55.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672962' 2026-03-08T23:12:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:12:55.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672962 -lt 42949672962 2026-03-08T23:12:55.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:12:55.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:12:55.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 
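[editor's note] wait_for_clean declares success when get_num_active_clean equals get_num_pgs. The jq filter traced for get_num_active_clean counts pg states that contain both "active" and "clean" but not "stale". The same count expressed in Python over the JSON shape that `ceph --format json pg dump pgs` returns (the sample records below are made up for illustration):

```python
def num_active_clean(pg_dump):
    """Python equivalent of the jq filter in get_num_active_clean:
    jq '.pg_stats | [.[] | .state
        | select(contains("active") and contains("clean"))
        | select(contains("stale") | not)] | length'"""
    return sum(
        1
        for pg in pg_dump["pg_stats"]
        if "active" in pg["state"]
        and "clean" in pg["state"]
        and "stale" not in pg["state"]
    )

sample = {"pg_stats": [
    {"state": "active+clean"},            # counted
    {"state": "stale+active+clean"},      # excluded: stale
    {"state": "active+recovering"},       # excluded: not clean
    {"state": "active+clean+scrubbing"},  # counted: contains active and clean
]}
num_active_clean(sample)  # -> 2
```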
2026-03-08T23:12:55.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:12:55.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:12:55.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:12:55.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:12:55.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:12:55.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:12:55.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:12:55.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:12:55.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:12:55.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:12:55.662 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:12:55.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:12:55.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:12:55.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:12:55.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:12:55.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1002: TEST_corrupt_scrub_replicated: create_pool foo 1 2026-03-08T23:12:55.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create foo 1 2026-03-08T23:12:56.087 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' created 2026-03-08T23:12:56.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:12:57.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1003: TEST_corrupt_scrub_replicated: create_pool csr_pool 1 1 2026-03-08T23:12:57.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create csr_pool 1 1 2026-03-08T23:12:57.363 INFO:tasks.workunit.client.0.vm03.stderr:pool 'csr_pool' created 2026-03-08T23:12:57.377 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:12:58.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1004: TEST_corrupt_scrub_replicated: wait_for_clean 2026-03-08T23:12:58.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:12:58.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:12:58.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:12:58.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:12:58.379 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:12:58.379 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:12:58.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:12:58.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:12:58.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:12:58.453 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:12:58.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:12:58.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:12:58.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:12:58.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:12:58.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:12:58.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:12:58.643 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:12:58.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:12:58.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:58.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:12:58.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 
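[editor's note] The delays array that get_timeout_delays expands to above (0.1, 0.2, 0.4, ... 15, 15, 4.5) is a doubling back-off capped at 15 s, with the last step trimmed so the steps sum exactly to the 90 s timeout (25.5 + 4x15 + 4.5 = 90). A sketch that reproduces the sequence, reverse-engineered from the array in this log rather than from the helper's actual bash; it works in tenths of a second so the arithmetic stays exact:

```python
def get_timeout_delays(timeout=90, first=0.1, cap=15):
    """Doubling back-off capped at `cap` seconds, with the final step
    trimmed so the delays sum to `timeout`. Integer tenths avoid float drift."""
    delays, d = [], int(first * 10)
    budget, total, cap10 = int(timeout * 10), 0, int(cap * 10)
    while total < budget:
        step = min(d, cap10, budget - total)  # cap, then trim the last step
        delays.append(step / 10)
        total += step
        d *= 2
    return delays

get_timeout_delays(90, 0.1)
# -> [0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8, 15.0, 15.0, 15.0, 15.0, 4.5]
```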
2026-03-08T23:12:58.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T23:12:58.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T23:12:58.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:12:58.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:12:58.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672964 2026-03-08T23:12:58.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672964 2026-03-08T23:12:58.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672964' 2026-03-08T23:12:58.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:12:58.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T23:12:58.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:12:58.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:12:58.808 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T23:12:58.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:12:58.809 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836486 2026-03-08T23:12:58.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486 2026-03-08T23:12:58.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T23:12:58.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:12:58.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836486 2026-03-08T23:12:58.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:12:59.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:12:59.979 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:00.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836486 2026-03-08T23:13:00.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1
2026-03-08T23:13:01.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:13:01.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:13:01.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836486
2026-03-08T23:13:01.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:13:01.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672964
2026-03-08T23:13:01.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:13:01.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:13:01.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672964
2026-03-08T23:13:01.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:13:01.347 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672964
2026-03-08T23:13:01.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672964
2026-03-08T23:13:01.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672964'
2026-03-08T23:13:01.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:13:01.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672964
2026-03-08T23:13:01.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:13:01.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:13:01.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:13:01.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:13:01.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:13:01.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:13:01.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:13:01.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:13:01.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:13:01.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:13:01.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:13:01.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:13:01.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:13:01.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:13:01.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:13:02.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:13:02.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:13:02.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:13:02.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: seq 1 19
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ1
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ1
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ1
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:02.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:02.363 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:02.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:02.568 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:02.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:02.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:02.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ1 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:02.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ1 hdr-ROBJ1
2026-03-08T23:13:02.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ1 key-ROBJ1 val-ROBJ1
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ2
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ2
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ2
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:02.879 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:02.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:03.089 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:03.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:03.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:03.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ2 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:03.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ2 hdr-ROBJ2
2026-03-08T23:13:03.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ2 key-ROBJ2 val-ROBJ2
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ3
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ3
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ3
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:03.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:03.389 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:03.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:03.598 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:03.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:03.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:03.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ3 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:03.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ3 hdr-ROBJ3
2026-03-08T23:13:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ3 key-ROBJ3 val-ROBJ3
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ4
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ4
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ4
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:03.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:03.892 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:03.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:04.102 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:04.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:04.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:04.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ4 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:04.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ4 hdr-ROBJ4
2026-03-08T23:13:04.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ4 key-ROBJ4 val-ROBJ4
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ5
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ5
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ5
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:04.542 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:04.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:04.748 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:04.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:04.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:04.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ5 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:04.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ5 hdr-ROBJ5
2026-03-08T23:13:04.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ5 key-ROBJ5 val-ROBJ5
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ6
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ6
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ6
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:04.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:05.050 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:05.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:05.260 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:05.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:05.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:05.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ6 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:05.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ6 hdr-ROBJ6
2026-03-08T23:13:05.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ6 key-ROBJ6 val-ROBJ6
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ7
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ7
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ7
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:05.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:05.562 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:05.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:05.771 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:05.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:05.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:05.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ7 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:05.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ7 hdr-ROBJ7
2026-03-08T23:13:05.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ7 key-ROBJ7 val-ROBJ7
2026-03-08T23:13:05.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:05.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ8
2026-03-08T23:13:05.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ8
2026-03-08T23:13:05.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:05.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:05.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ8
2026-03-08T23:13:05.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:05.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:05.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:06.080 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:06.515 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:06.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:06.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:06.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ8 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:06.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ8 hdr-ROBJ8
2026-03-08T23:13:06.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ8 key-ROBJ8 val-ROBJ8
2026-03-08T23:13:06.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:06.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ9
2026-03-08T23:13:06.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ9
2026-03-08T23:13:06.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:06.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:06.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ9
2026-03-08T23:13:06.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:06.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:06.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:06.820 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:06.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:07.030 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:07.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:07.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:07.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ9 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:07.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ9 hdr-ROBJ9
2026-03-08T23:13:07.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ9 key-ROBJ9 val-ROBJ9
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ10
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ10
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ10
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:07.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:07.352 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:07.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:07.560 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:07.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:07.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:07.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ10 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:07.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ10 hdr-ROBJ10
2026-03-08T23:13:07.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ10 key-ROBJ10 val-ROBJ10
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ11
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ11
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ11
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:13:07.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:13:07.855 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:13:07.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:13:08.065 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:13:08.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:13:08.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:13:08.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ11 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:13:08.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ11 hdr-ROBJ11
2026-03-08T23:13:08.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ11 key-ROBJ11 val-ROBJ11
2026-03-08T23:13:08.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:08.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ12
2026-03-08T23:13:08.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ12
2026-03-08T23:13:08.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:13:08.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72:
add_something: local poolname=csr_pool 2026-03-08T23:13:08.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ12 2026-03-08T23:13:08.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:13:08.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:13:08.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:13:08.380 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:13:08.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:13:08.588 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:13:08.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:13:08.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:13:08.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ12 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:13:08.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ12 hdr-ROBJ12 2026-03-08T23:13:08.647 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ12 key-ROBJ12 val-ROBJ12 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ13 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ13 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ13 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:13:08.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:13:08.895 
INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:13:08.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:13:09.119 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:13:09.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:13:09.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:13:09.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ13 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:13:09.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ13 hdr-ROBJ13 2026-03-08T23:13:09.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ13 key-ROBJ13 val-ROBJ13 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ14 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair 
csr_pool ROBJ14 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ14 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:13:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:13:09.435 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:13:09.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:13:09.713 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:13:09.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:13:09.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:13:09.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados 
--pool csr_pool put ROBJ14 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:13:09.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ14 hdr-ROBJ14 2026-03-08T23:13:09.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ14 key-ROBJ14 val-ROBJ14 2026-03-08T23:13:09.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:09.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ15 2026-03-08T23:13:09.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ15 2026-03-08T23:13:09.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:13:09.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool 2026-03-08T23:13:09.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ15 2026-03-08T23:13:09.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:13:09.798 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:13:09.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:13:10.014 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:13:10.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:13:10.223 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:13:10.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:13:10.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:13:10.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ15 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:13:10.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ15 hdr-ROBJ15 2026-03-08T23:13:10.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ15 key-ROBJ15 val-ROBJ15 2026-03-08T23:13:10.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:10.311 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ16 2026-03-08T23:13:10.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ16 2026-03-08T23:13:10.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:13:10.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool 2026-03-08T23:13:10.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ16 2026-03-08T23:13:10.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:13:10.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:13:10.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:13:10.604 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:13:10.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:13:10.853 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:13:10.870 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:13:10.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:13:10.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ16 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:13:10.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ16 hdr-ROBJ16 2026-03-08T23:13:10.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ16 key-ROBJ16 val-ROBJ16 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ17 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ17 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: 
add_something: local poolname=csr_pool 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ17 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:13:10.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:13:11.169 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:13:11.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:13:11.446 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:13:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:13:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:13:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ17 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:13:11.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ17 hdr-ROBJ17 2026-03-08T23:13:11.509 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ17 key-ROBJ17 val-ROBJ17 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ18 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair csr_pool ROBJ18 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ18 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:13:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:13:11.753 
INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:13:11.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:13:12.069 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:13:12.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:13:12.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:13:12.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ18 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:13:12.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ18 hdr-ROBJ18 2026-03-08T23:13:12.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ18 key-ROBJ18 val-ROBJ18 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1006: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1007: TEST_corrupt_scrub_replicated: objname=ROBJ19 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1008: TEST_corrupt_scrub_replicated: add_something td/osd-scrub-repair 
csr_pool ROBJ19 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ19 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:13:12.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:13:12.412 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:13:12.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:13:12.617 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:13:12.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:13:12.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:13:12.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados 
--pool csr_pool put ROBJ19 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:13:12.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1010: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapheader ROBJ19 hdr-ROBJ19 2026-03-08T23:13:12.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1011: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ19 key-ROBJ19 val-ROBJ19 2026-03-08T23:13:12.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1015: TEST_corrupt_scrub_replicated: dd if=/dev/zero of=td/osd-scrub-repair/new.ROBJ19 bs=1024 count=1025 2026-03-08T23:13:12.707 INFO:tasks.workunit.client.0.vm03.stderr:1025+0 records in 2026-03-08T23:13:12.707 INFO:tasks.workunit.client.0.vm03.stderr:1025+0 records out 2026-03-08T23:13:12.707 INFO:tasks.workunit.client.0.vm03.stderr:1049600 bytes (1.0 MB, 1.0 MiB) copied, 0.00243206 s, 432 MB/s 2026-03-08T23:13:12.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1016: TEST_corrupt_scrub_replicated: rados --pool csr_pool put ROBJ19 td/osd-scrub-repair/new.ROBJ19 2026-03-08T23:13:12.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1017: TEST_corrupt_scrub_replicated: rm -f td/osd-scrub-repair/new.ROBJ19 2026-03-08T23:13:12.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1019: TEST_corrupt_scrub_replicated: get_pg csr_pool ROBJ0 2026-03-08T23:13:12.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=csr_pool 2026-03-08T23:13:12.744 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=ROBJ0 2026-03-08T23:13:12.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map csr_pool ROBJ0 2026-03-08T23:13:12.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:13:12.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1019: TEST_corrupt_scrub_replicated: local pg=3.0 2026-03-08T23:13:12.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1020: TEST_corrupt_scrub_replicated: get_primary csr_pool ROBJ0 2026-03-08T23:13:12.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=csr_pool 2026-03-08T23:13:12.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=ROBJ0 2026-03-08T23:13:12.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map csr_pool ROBJ0 2026-03-08T23:13:12.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:13:13.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1020: TEST_corrupt_scrub_replicated: local primary=1 2026-03-08T23:13:13.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1023: TEST_corrupt_scrub_replicated: get_asok_path 
osd.0 2026-03-08T23:13:13.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:13:13.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:13:13.106 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:13:13.106 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:13:13.106 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:13:13.107 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:13:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1023: TEST_corrupt_scrub_replicated: CEPH_ARGS= 2026-03-08T23:13:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1023: TEST_corrupt_scrub_replicated: ceph daemon /tmp/ceph-asok.43024/ceph-osd.0.asok config set osd_deep_scrub_update_digest_min_age 0 2026-03-08T23:13:13.168 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:13:13.168 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_deep_scrub_update_digest_min_age = '' (not observed, change may require restart) osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' 
osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T23:13:13.168 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:13:13.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1025: TEST_corrupt_scrub_replicated: get_asok_path osd.1 2026-03-08T23:13:13.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:13:13.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:13:13.180 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:13:13.180 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:13:13.180 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:13:13.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T23:13:13.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1025: TEST_corrupt_scrub_replicated: CEPH_ARGS= 2026-03-08T23:13:13.180 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1025: TEST_corrupt_scrub_replicated: ceph daemon /tmp/ceph-asok.43024/ceph-osd.1.asok config set osd_deep_scrub_update_digest_min_age 0 2026-03-08T23:13:13.240 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:13:13.240 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_deep_scrub_update_digest_min_age = '' (not observed, change may require restart) osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T23:13:13.240 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:13:13.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1027: TEST_corrupt_scrub_replicated: pg_deep_scrub 3.0 2026-03-08T23:13:13.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1941: pg_deep_scrub: local pgid=3.0 2026-03-08T23:13:13.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1943: pg_deep_scrub: wait_for_pg_clean 3.0 2026-03-08T23:13:13.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=3.0 2026-03-08T23:13:13.249 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:13:13.250 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:13:13.250 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:13:13.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:13:13.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:13:13.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:13:13.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:13:13.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:13:13.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:13:13.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:13:13.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local 
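The `get_timeout_delays 90 1 3` call traced above expands into a 31-entry delay array (`1 2 3 3 … 3`) whose sum reaches the 90-second timeout. A minimal POSIX-shell sketch of that backoff logic follows; the `_sketch` name and the exact doubling-then-cap rule are assumptions reconstructed from this trace, not the helper's verbatim source:

```shell
# Emit a sequence of sleep delays: start at $2, double until capped at $3,
# then repeat the cap until the delays sum to at least $1 seconds.
get_timeout_delays_sketch() {
    timeout=$1; first=$2; max=$3
    delays=""; total=0; d=$first
    while [ "$total" -lt "$timeout" ]; do
        delays="$delays $d"
        total=$((total + d))
        # double the delay while it stays under the cap, then hold at the cap
        if [ $((d * 2)) -le "$max" ]; then d=$((d * 2)); else d=$max; fi
    done
    echo $delays   # unquoted on purpose: collapses the leading space
}
```

With the arguments from the trace (`90 1 3`) this yields `1 2` followed by twenty-nine `3`s, matching the `delays=(…)` assignment logged at ceph-helpers.sh:1712.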
timeout=300 2026-03-08T23:13:13.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:13:13.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:13:13.621 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:13:13.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:13:13.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:13.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:13:13.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836490 2026-03-08T23:13:13.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836490 2026-03-08T23:13:13.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490' 2026-03-08T23:13:13.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:13.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:13:13.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672968 2026-03-08T23:13:13.786 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672968 2026-03-08T23:13:13.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-42949672968' 2026-03-08T23:13:13.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:13:13.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836490 2026-03-08T23:13:13.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:13:13.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:13:13.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836490 2026-03-08T23:13:13.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:13:13.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836490 2026-03-08T23:13:13.790 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836490 2026-03-08T23:13:13.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836490' 2026-03-08T23:13:13.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
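The `flush_pg_stats` section above accumulates one `<osd>-<seq>` token per OSD (`0-21474836490 1-42949672968`) and then splits each token back apart with `cut -d - -f 1` / `-f 2` before polling `ceph osd last-stat-seq`. A hypothetical distillation of just that parsing step (function name is mine, and the real helper polls the cluster rather than echoing):

```shell
# Split each "<osd>-<seq>" pair the way flush_pg_stats does (cut on '-'),
# and print the same "waiting osd.N seq S" line the trace shows on stdout.
parse_seqs() {
    for s in $1; do                      # unquoted: word-split the pair list
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
    done
}
```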
2026-03-08T23:13:13.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836488 -lt 21474836490 2026-03-08T23:13:13.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:13:14.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:13:14.961 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:15.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836490 2026-03-08T23:13:15.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:13:15.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672968 2026-03-08T23:13:15.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:13:15.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:13:15.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672968 2026-03-08T23:13:15.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:13:15.144 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672968 2026-03-08T23:13:15.144 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672968 2026-03-08T23:13:15.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672968' 2026-03-08T23:13:15.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:13:15.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672968 -lt 42949672968 2026-03-08T23:13:15.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:13:15.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 3.0 loop 0' 2026-03-08T23:13:15.330 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 3.0 loop 0 2026-03-08T23:13:15.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 3.0 2026-03-08T23:13:15.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=3.0 2026-03-08T23:13:15.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:13:15.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 3.0 query 2026-03-08T23:13:15.330 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:13:15.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:13:15.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:13:15.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:13:15.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:13:15.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:13:15.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:13:15.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:13:15.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:13:15.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:13:15.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: 
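The `is_pg_clean` check above treats a PG as clean when the `jq`-extracted state string merely *starts with* `active+clean` (the escaped glob `\a\c\t\i\v\e\+\c\l\e\a\n*` in the trace), so states like `active+clean+scrubbing` also pass. A portable sketch of that prefix test, using `case` in place of the helper's bash-specific `[[ ]]`:

```shell
# Return 0 iff the PG state string begins with "active+clean"
# (suffixes such as +scrubbing or +snaptrim still count as clean).
is_state_clean() {
    case $1 in
        active+clean*) return 0 ;;
        *)             return 1 ;;
    esac
}
```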
local last_scrub=2026-03-08T23:12:57.362278+0000 2026-03-08T23:13:15.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: pg_deep_scrub: ceph pg deep-scrub 3.0 2026-03-08T23:13:15.759 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0 on osd.1 to deep-scrub 2026-03-08T23:13:15.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: pg_deep_scrub: wait_for_scrub 3.0 2026-03-08T23:12:57.362278+0000 last_deep_scrub_stamp 2026-03-08T23:13:15.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0 2026-03-08T23:13:15.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:12:57.362278+0000 2026-03-08T23:13:15.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_deep_scrub_stamp 2026-03-08T23:13:15.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:13:15.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:13:15.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:13:15.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:13:15.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local 
sname=last_deep_scrub_stamp 2026-03-08T23:13:15.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:13:15.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:13:15.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:12:57.362278+0000 '>' 2026-03-08T23:12:57.362278+0000 2026-03-08T23:13:15.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:13:16.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:13:16.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:13:16.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:13:16.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:13:16.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:13:16.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:13:16.955 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:13:17.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:12:57.362278+0000 '>' 2026-03-08T23:12:57.362278+0000 2026-03-08T23:13:17.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:13:18.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:13:18.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:13:18.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:13:18.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:13:18.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:13:18.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:13:18.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:13:18.305 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:12:57.362278+0000 '>' 2026-03-08T23:12:57.362278+0000 2026-03-08T23:13:18.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:13:19.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:13:19.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:13:19.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:13:19.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:13:19.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:13:19.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:13:19.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:13:19.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:12:57.362278+0000 '>' 2026-03-08T23:12:57.362278+0000 2026-03-08T23:13:19.485 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:13:20.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:13:20.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:13:20.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:13:20.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:13:20.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:13:20.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:13:20.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:13:20.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:13:16.031237+0000 '>' 2026-03-08T23:12:57.362278+0000 2026-03-08T23:13:20.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:13:20.673 
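The `wait_for_scrub` loop above polls `last_deep_scrub_stamp` once per second (up to 300 tries) until the stamp is strictly newer than the one recorded before `ceph pg deep-scrub` was issued; ISO-8601 timestamps compare correctly as plain strings, which is why `test … '>' …` suffices. A hedged, cluster-free sketch of that polling pattern — the stamp source is passed in as a command so the example stays self-contained (the real helper calls `get_last_scrub_stamp` instead):

```shell
# Poll $2 (a command printing a timestamp) every second, up to $3 tries,
# until it returns a stamp lexically greater than $1. ISO-8601 stamps
# sort chronologically as strings, so '>' is a valid "newer than" test.
wait_for_stamp_change() {
    old_stamp=$1; get_stamp_cmd=$2; tries=${3:-300}
    i=0
    while [ "$i" -lt "$tries" ]; do
        cur=$($get_stamp_cmd)
        if [ "$cur" '>' "$old_stamp" ]; then
            return 0
        fi
        sleep 1
        i=$((i + 1))
    done
    return 1
}
```

In the trace, the stamp stays at `23:12:57.362278` for four polls and then jumps to `23:13:16.031237`, at which point the comparison succeeds and the helper returns 0.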
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: seq 1 19 2026-03-08T23:13:20.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:20.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ1 2026-03-08T23:13:20.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 1 % 2 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1038: TEST_corrupt_scrub_replicated: local payload=UVWXYZZZ 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1039: TEST_corrupt_scrub_replicated: echo UVWXYZZZ 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1040: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 
2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 
2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:13:20.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:13:20.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:13:20.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:13:20.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:13:20.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:13:20.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:13:20.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:13:20.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:13:20.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: 
ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ1 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:13:21.701 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#-5:00000000:::scrub_3.0:head#, (61) No data available 2026-03-08T23:13:22.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:13:22.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:13:22.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:13:22.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:13:22.243 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:13:22.243 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 
2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:13:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:13:22.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:13:22.244 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:13:22.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:13:22.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:13:22.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:13:22.245 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:13:22.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:13:22.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:13:22.264 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:22.261+0000 7f864eca08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:22.264 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:22.265+0000 7f864eca08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:22.266 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:22.265+0000 7f864eca08c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:13:22.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:23.228 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:23.229+0000 7f864eca08c0 -1 Falling back to public interface 2026-03-08T23:13:23.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:13:23.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:23.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:13:23.682 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:13:23.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:23.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:13:23.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:24.478 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:24.477+0000 7f864eca08c0 -1 osd.1 61 log_to_monitors true 2026-03-08T23:13:24.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:24.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:24.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:13:24.858 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:13:24.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:24.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:13:25.038 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:25.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:25.893+0000 7f8645c50640 -1 osd.1 61 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:13:26.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:26.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:26.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:13:26.039 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:13:26.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:26.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:13:26.242 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 65 up_thru 65 down_at 62 last_clean_interval [10,61) [v2:127.0.0.1:6810/4111382374,v1:127.0.0.1:6811/4111382374] [v2:127.0.0.1:6812/4111382374,v1:127.0.0.1:6813/4111382374] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:13:26.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:13:26.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:13:26.242 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:13:26.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:13:26.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:13:26.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:13:26.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:13:26.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:13:26.243 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:13:26.243 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:13:26.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:13:26.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:13:26.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:13:26.320 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:13:26.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:13:26.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:13:26.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:13:26.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:13:26.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:13:26.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:13:26.494 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:13:26.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:13:26.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:26.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:13:26.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836493 
2026-03-08T23:13:26.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836493 2026-03-08T23:13:26.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493' 2026-03-08T23:13:26.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:26.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:13:26.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=279172874242 2026-03-08T23:13:26.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 279172874242 2026-03-08T23:13:26.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493 1-279172874242' 2026-03-08T23:13:26.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:13:26.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836493 2026-03-08T23:13:26.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:13:26.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:13:26.672 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836493 2026-03-08T23:13:26.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:13:26.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836493 2026-03-08T23:13:26.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836493' 2026-03-08T23:13:26.673 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836493 2026-03-08T23:13:26.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:26.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836493 2026-03-08T23:13:26.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:13:27.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:13:27.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:28.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836493 2026-03-08T23:13:28.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T23:13:29.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:13:29.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:29.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836494 -lt 21474836493 2026-03-08T23:13:29.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:13:29.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-279172874242 2026-03-08T23:13:29.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:13:29.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:13:29.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-279172874242 2026-03-08T23:13:29.209 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:13:29.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=279172874242 2026-03-08T23:13:29.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 279172874242' 2026-03-08T23:13:29.210 INFO:tasks.workunit.client.0.vm03.stderr:waiting 
osd.1 seq 279172874242 2026-03-08T23:13:29.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:13:29.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 279172874242 -lt 279172874242 2026-03-08T23:13:29.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:13:29.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:13:29.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:13:29.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:13:29.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:13:29.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:13:29.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:13:29.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:13:29.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: 
expression+='select(contains("stale") | not)' 2026-03-08T23:13:29.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:13:29.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:13:29.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:13:29.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:13:29.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:13:29.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:13:29.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:13:29.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:13:29.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:13:29.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:29.998 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ2 2026-03-08T23:13:29.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 2 % 2 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1045: TEST_corrupt_scrub_replicated: local payload=UVWXYZ 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1046: TEST_corrupt_scrub_replicated: echo UVWXYZ 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1047: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:13:30.000 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:13:30.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:13:30.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:13:30.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:13:30.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:13:30.001 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:13:30.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:13:30.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:13:30.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:13:30.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:13:30.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:13:30.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:13:30.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:13:30.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ2 set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:13:31.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:13:31.363 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:13:31.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:13:31.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:13:31.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:13:31.364 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:13:31.364 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:13:31.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:13:31.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:13:31.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:13:31.365 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:13:31.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:13:31.366 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:13:31.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:13:31.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:13:31.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:13:31.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:13:31.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:13:31.371 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:13:31.382 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:31.381+0000 7fc26dc068c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:31.383 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:31.385+0000 7fc26dc068c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:31.385 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:31.385+0000 7fc26dc068c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:13:31.551 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:31.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:13:31.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:32.584 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:32.585+0000 7fc26dc068c0 -1 Falling back to public interface 2026-03-08T23:13:32.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:32.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:32.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:13:32.722 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:13:32.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:32.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:13:32.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:33.811 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:33.813+0000 7fc26dc068c0 -1 osd.0 67 log_to_monitors true 2026-03-08T23:13:33.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:33.900 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:33.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:13:33.900 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:13:33.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:33.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:13:34.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:35.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:35.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:35.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:13:35.087 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:13:35.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:35.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 71 up_thru 71 down_at 68 last_clean_interval [5,67) [v2:127.0.0.1:6802/1910478659,v1:127.0.0.1:6803/1910478659] 
[v2:127.0.0.1:6804/1910478659,v1:127.0.0.1:6805/1910478659] exists,up b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:13:35.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:13:35.284 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:13:35.284 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:13:35.284 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 
2026-03-08T23:13:35.284 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:13:35.284 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:13:35.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:13:35.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:13:35.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:13:35.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:13:35.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:13:35.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:13:35.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:13:35.524 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:13:35.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:13:35.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:13:35.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:13:35.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678018 2026-03-08T23:13:35.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678018 2026-03-08T23:13:35.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-304942678018' 2026-03-08T23:13:35.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:35.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:13:35.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=279172874245 2026-03-08T23:13:35.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 279172874245 2026-03-08T23:13:35.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-304942678018 1-279172874245' 2026-03-08T23:13:35.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:13:35.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-304942678018 2026-03-08T23:13:35.689 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:13:35.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:13:35.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-304942678018 2026-03-08T23:13:35.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:13:35.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678018 2026-03-08T23:13:35.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 304942678018' 2026-03-08T23:13:35.692 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 304942678018 2026-03-08T23:13:35.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 304942678018 2026-03-08T23:13:35.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:13:36.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:13:36.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:13:37.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678018 -lt 304942678018 2026-03-08T23:13:37.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:13:37.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-279172874245 2026-03-08T23:13:37.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:13:37.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:13:37.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-279172874245 2026-03-08T23:13:37.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:13:37.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=279172874245 2026-03-08T23:13:37.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 279172874245' 2026-03-08T23:13:37.044 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 279172874245 2026-03-08T23:13:37.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:13:37.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
test 279172874245 -lt 279172874245 2026-03-08T23:13:37.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:13:37.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:13:37.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:13:37.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:13:37.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:13:37.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:13:37.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:13:37.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:13:37.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:13:37.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:13:37.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: 
get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:13:37.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:13:37.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:13:37.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:13:37.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:13:37.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:13:37.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:13:37.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:13:37.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:37.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ3 2026-03-08T23:13:37.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 3 % 2 2026-03-08T23:13:37.793 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1052: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ3 remove 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ3 remove 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: 
_objectstore_tool_nowait: local id=1
2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:13:37.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:13:37.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:13:37.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:13:37.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:13:37.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:13:37.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ3 remove
2026-03-08T23:13:37.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:13:37.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:13:37.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:13:37.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:13:37.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:13:37.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ3 remove
2026-03-08T23:13:38.572 INFO:tasks.workunit.client.0.vm03.stdout:remove #3:f2a5b2a4:::ROBJ3:head#
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:13:39.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:13:39.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:13:39.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:13:39.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:13:39.103 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:13:39.103 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:13:39.103 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:13:39.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:13:39.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:13:39.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T23:13:39.105 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T23:13:39.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:13:39.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T23:13:39.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T23:13:39.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:13:39.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:13:39.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:13:39.123 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:39.121+0000 7fe7ded138c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:13:39.123 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:39.125+0000 7fe7ded138c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:13:39.124 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:39.125+0000 7fe7ded138c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:13:39.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:13:39.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:13:40.083 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:40.085+0000 7fe7ded138c0 -1 Falling back to public interface
2026-03-08T23:13:40.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:13:40.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:13:40.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:13:40.454 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:13:40.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:13:40.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:13:40.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:13:41.081 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:41.081+0000 7fe7ded138c0 -1 osd.1 73 log_to_monitors true
2026-03-08T23:13:41.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:13:41.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:13:41.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:13:41.625 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:13:41.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:13:41.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:13:41.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:13:42.104 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:42.105+0000 7fe7d5cc3640 -1 osd.1 73 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:13:42.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:13:42.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:13:42.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:13:42.815 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:13:42.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:13:42.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 77 up_thru 77 down_at 74 last_clean_interval [65,73) [v2:127.0.0.1:6810/4140317191,v1:127.0.0.1:6811/4140317191] [v2:127.0.0.1:6812/4140317191,v1:127.0.0.1:6813/4140317191] exists,up f603f62e-24cf-4fca-903d-50e492fba08d
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:13:42.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:13:42.983 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:13:42.983 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:13:42.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:13:42.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:13:42.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:13:43.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:13:43.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:13:43.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:13:43.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:13:43.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:13:43.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:13:43.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:13:43.230 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:13:43.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:13:43.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:13:43.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:13:43.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=304942678020
2026-03-08T23:13:43.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 304942678020
2026-03-08T23:13:43.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-304942678020'
2026-03-08T23:13:43.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:13:43.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:13:43.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481794
2026-03-08T23:13:43.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481794
2026-03-08T23:13:43.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-304942678020 1-330712481794'
2026-03-08T23:13:43.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:13:43.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-304942678020
2026-03-08T23:13:43.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:13:43.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:13:43.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-304942678020
2026-03-08T23:13:43.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:13:43.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=304942678020
2026-03-08T23:13:43.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 304942678020'
2026-03-08T23:13:43.403 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 304942678020
2026-03-08T23:13:43.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:13:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678019 -lt 304942678020
2026-03-08T23:13:43.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:13:44.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:13:44.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:13:44.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 304942678021 -lt 304942678020
2026-03-08T23:13:44.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:13:44.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-330712481794
2026-03-08T23:13:44.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:13:44.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:13:44.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-330712481794
2026-03-08T23:13:44.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:13:44.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481794
2026-03-08T23:13:44.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 330712481794'
2026-03-08T23:13:44.752 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 330712481794
2026-03-08T23:13:44.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:13:44.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481794 -lt 330712481794
2026-03-08T23:13:44.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:13:44.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:13:44.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:13:45.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:13:45.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:13:45.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:13:45.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:13:45.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:13:45.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:13:45.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:13:45.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:13:45.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:13:45.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:13:45.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:13:45.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:13:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:13:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:13:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:13:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:13:45.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ4
2026-03-08T23:13:45.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 4 % 2
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1057: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ4 set-omap key-ROBJ4 td/osd-scrub-repair/CORRUPT
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ4 set-omap key-ROBJ4 td/osd-scrub-repair/CORRUPT
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:13:45.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:13:45.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0
2026-03-08T23:13:45.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:13:45.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T23:13:45.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:13:45.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:13:45.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:13:45.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:13:45.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:13:45.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:13:45.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ4 set-omap key-ROBJ4 td/osd-scrub-repair/CORRUPT
2026-03-08T23:13:45.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:13:45.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:13:45.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:13:45.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:13:45.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:13:45.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ4 set-omap key-ROBJ4 td/osd-scrub-repair/CORRUPT
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:13:46.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:13:46.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:13:46.792 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:13:46.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:13:46.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:13:46.793 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:13:46.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' 
'--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:13:46.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:13:46.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:13:46.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:13:46.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:13:46.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:13:46.811 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:46.809+0000 7fa9ec6f58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:46.811 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:46.813+0000 7fa9ec6f58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:46.812 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:46.813+0000 7fa9ec6f58c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:13:46.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:46.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:13:47.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:47.772 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:47.773+0000 7fa9ec6f58c0 -1 Falling back to public interface 2026-03-08T23:13:48.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:13:48.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:48.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:13:48.145 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:13:48.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:48.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:13:48.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:49.045 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:49.045+0000 7fa9ec6f58c0 -1 osd.0 78 log_to_monitors true 2026-03-08T23:13:49.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:49.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:49.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:13:49.317 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:13:49.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:49.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:13:49.489 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:50.209 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:50.209+0000 7fa9e36a5640 -1 osd.0 78 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:13:50.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:50.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:50.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:13:50.491 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:13:50.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:50.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:13:50.670 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 82 up_thru 82 down_at 79 last_clean_interval [71,78) [v2:127.0.0.1:6802/2131937133,v1:127.0.0.1:6803/2131937133] [v2:127.0.0.1:6804/2131937133,v1:127.0.0.1:6805/2131937133] exists,up b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:13:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:13:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:13:50.671 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:13:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:13:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:13:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:13:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:13:50.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:13:50.671 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:13:50.672 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:13:50.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:13:50.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:13:50.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:13:50.741 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:13:50.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:13:50.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:13:50.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:13:50.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:13:50.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:13:50.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:13:50.911 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:13:50.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:13:50.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:50.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:13:50.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318274 
2026-03-08T23:13:50.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318274 2026-03-08T23:13:50.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318274' 2026-03-08T23:13:50.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:50.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:13:51.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=330712481796 2026-03-08T23:13:51.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 330712481796 2026-03-08T23:13:51.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318274 1-330712481796' 2026-03-08T23:13:51.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:13:51.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-352187318274 2026-03-08T23:13:51.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:13:51.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:13:51.085 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-352187318274 2026-03-08T23:13:51.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:13:51.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318274 2026-03-08T23:13:51.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 352187318274' 2026-03-08T23:13:51.086 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 352187318274 2026-03-08T23:13:51.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:51.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 352187318274 2026-03-08T23:13:51.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:13:52.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:13:52.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:52.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 352187318274 2026-03-08T23:13:52.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T23:13:53.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:13:53.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:13:53.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318274 -lt 352187318274 2026-03-08T23:13:53.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:13:53.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-330712481796 2026-03-08T23:13:53.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:13:53.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:13:53.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-330712481796 2026-03-08T23:13:53.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:13:53.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=330712481796 2026-03-08T23:13:53.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 330712481796' 2026-03-08T23:13:53.598 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 
seq 330712481796 2026-03-08T23:13:53.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:13:53.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 330712481797 -lt 330712481796 2026-03-08T23:13:53.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:13:53.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:13:53.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:13:53.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:13:53.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:13:53.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:13:53.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:13:53.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:13:53.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: 
expression+='select(contains("stale") | not)' 2026-03-08T23:13:53.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:13:53.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:13:54.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:13:54.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:13:54.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:13:54.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:13:54.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:13:54.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:13:54.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:13:54.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:13:54.362 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ5 2026-03-08T23:13:54.363 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 5 % 2 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1062: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ5 rm-omap key-ROBJ5 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ5 rm-omap key-ROBJ5 2026-03-08T23:13:54.364 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:13:54.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:13:54.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:13:54.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:13:54.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:13:54.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:13:54.476 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ5 rm-omap key-ROBJ5 2026-03-08T23:13:54.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:13:54.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:13:54.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:13:54.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:13:54.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:13:54.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ5 rm-omap key-ROBJ5 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:13:55.690 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:13:55.690 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:13:55.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:13:55.691 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:13:55.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:13:55.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:13:55.692 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:13:55.692 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:13:55.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:13:55.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:13:55.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:13:55.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:13:55.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:13:55.709 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:55.709+0000 7fe5b46a08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:55.709 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:55.709+0000 7fe5b46a08c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:13:55.710 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:55.709+0000 7fe5b46a08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:55.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:13:56.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:56.668 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:56.669+0000 7fe5b46a08c0 -1 Falling back to 
public interface 2026-03-08T23:13:57.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:57.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:57.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:13:57.070 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:13:57.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:57.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:13:57.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:57.648 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:13:57.649+0000 7fe5b46a08c0 -1 osd.1 84 log_to_monitors true 2026-03-08T23:13:58.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:58.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:58.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:13:58.247 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:13:58.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:58.247 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:13:58.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:13:59.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:13:59.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:13:59.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:13:59.449 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:13:59.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:13:59.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:13:59.622 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 88 up_thru 88 down_at 85 last_clean_interval [77,84) [v2:127.0.0.1:6810/975491952,v1:127.0.0.1:6811/975491952] [v2:127.0.0.1:6812/975491952,v1:127.0.0.1:6813/975491952] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:13:59.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:13:59.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:13:59.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:13:59.622 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:13:59.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:13:59.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:13:59.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:13:59.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:13:59.623 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:13:59.623 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:13:59.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:13:59.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:13:59.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:13:59.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:13:59.689 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:13:59.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:13:59.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:13:59.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:13:59.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:13:59.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:13:59.895 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:13:59.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:13:59.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:59.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:13:59.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=352187318277 2026-03-08T23:13:59.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 352187318277 2026-03-08T23:13:59.979 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318277' 2026-03-08T23:13:59.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:13:59.979 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:14:00.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=377957122050 2026-03-08T23:14:00.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 377957122050 2026-03-08T23:14:00.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-352187318277 1-377957122050' 2026-03-08T23:14:00.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:00.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-352187318277 2026-03-08T23:14:00.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:00.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:14:00.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-352187318277 2026-03-08T23:14:00.068 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:00.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=352187318277 2026-03-08T23:14:00.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 352187318277' 2026-03-08T23:14:00.069 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 352187318277 2026-03-08T23:14:00.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:00.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318275 -lt 352187318277 2026-03-08T23:14:00.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:01.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:14:01.240 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:01.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 352187318277 -lt 352187318277 2026-03-08T23:14:01.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:01.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
echo 1-377957122050 2026-03-08T23:14:01.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:01.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:14:01.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-377957122050 2026-03-08T23:14:01.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:01.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=377957122050 2026-03-08T23:14:01.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 377957122050' 2026-03-08T23:14:01.423 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 377957122050 2026-03-08T23:14:01.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:14:01.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 377957122050 -lt 377957122050 2026-03-08T23:14:01.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:14:01.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:01.599 
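The `flush_pg_stats` trace above collects one `<osd>-<seq>` pair per OSD (`seqs=' 0-352187318277 1-377957122050'`), then splits each pair back apart with `cut` before polling `ceph osd last-stat-seq` until it reaches that sequence. A minimal sketch of just the bookkeeping step, runnable without a cluster (the `seqs` values are copied from the trace; the polling against `ceph` is omitted):

```shell
# Stand-alone rendition of the seq bookkeeping traced in flush_pg_stats:
# each entry is "<osd>-<seq>"; cut splits the pair on the dash.
seqs=' 0-352187318277 1-377957122050'
for s in $seqs; do
    osd=$(echo $s | cut -d - -f 1)   # part before the dash: osd id
    seq=$(echo $s | cut -d - -f 2)   # part after the dash: stat seq
    echo "waiting osd.$osd seq $seq"
done
```

The real helper follows each `echo` by looping on `ceph osd last-stat-seq $osd` with a 1s sleep until the reported value is no longer less than `$seq`, as seen in the `test ... -lt ...` lines of the trace.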
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:01.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:14:01.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:14:01.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:14:01.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:14:01.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:14:01.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:14:01.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:14:01.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:14:01.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:14:01.983 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:14:01.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:01.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:02.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:14:02.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:14:02.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:14:02.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:14:02.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ6 2026-03-08T23:14:02.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 6 % 2 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:14:02.183 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1067: TEST_corrupt_scrub_replicated: echo extra 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1068: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ6 set-omap key2-ROBJ6 td/osd-scrub-repair/extra-val 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ6 set-omap key2-ROBJ6 td/osd-scrub-repair/extra-val 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:14:02.183 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:14:02.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:14:02.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:14:02.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ6 set-omap key2-ROBJ6 td/osd-scrub-repair/extra-val 2026-03-08T23:14:02.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:14:02.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:14:02.288 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:14:02.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:14:02.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:14:02.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ6 set-omap key2-ROBJ6 td/osd-scrub-repair/extra-val
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:14:03.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:14:03.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:14:03.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:14:03.492 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T23:14:03.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:14:03.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:14:03.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:14:03.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:14:03.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:14:03.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:14:03.508 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:03.509+0000 7f1958d518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:14:03.509 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:03.509+0000 7f1958d518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:14:03.511 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:03.513+0000 7f1958d518c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:14:03.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:14:03.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:14:03.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:14:04.711 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:04.713+0000 7f1958d518c0 -1 Falling back to public interface
2026-03-08T23:14:04.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:14:04.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:14:04.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:14:04.856 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:14:04.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:14:04.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:14:05.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:14:05.965 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:05.965+0000 7f1958d518c0 -1 osd.0 89 log_to_monitors true
2026-03-08T23:14:06.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:14:06.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:14:06.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:14:06.034 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:14:06.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:14:06.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:14:06.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:14:07.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:14:07.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:14:07.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:14:07.217 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:14:07.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:14:07.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:14:07.401 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 93 up_thru 93 down_at 90 last_clean_interval [82,89) [v2:127.0.0.1:6802/1997048331,v1:127.0.0.1:6803/1997048331] [v2:127.0.0.1:6804/1997048331,v1:127.0.0.1:6805/1997048331] exists,up b8b43426-8172-41a4-ad76-a875625b04e6
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:14:07.402 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:14:07.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:14:07.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:14:07.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:14:07.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:14:07.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:14:07.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:14:07.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:14:07.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:14:07.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:14:07.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:14:07.637 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:14:07.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:14:07.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:14:07.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:14:07.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=399431958530
2026-03-08T23:14:07.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 399431958530
2026-03-08T23:14:07.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-399431958530'
2026-03-08T23:14:07.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:14:07.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:14:07.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=377957122053
2026-03-08T23:14:07.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 377957122053
2026-03-08T23:14:07.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-399431958530 1-377957122053'
2026-03-08T23:14:07.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:14:07.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-399431958530
2026-03-08T23:14:07.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:14:07.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:14:07.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-399431958530
2026-03-08T23:14:07.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:14:07.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=399431958530
2026-03-08T23:14:07.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 399431958530'
2026-03-08T23:14:07.810 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 399431958530
2026-03-08T23:14:07.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:14:07.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 399431958530
2026-03-08T23:14:07.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:14:08.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:14:08.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:14:09.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 399431958530 -lt 399431958530
2026-03-08T23:14:09.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:14:09.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-377957122053
2026-03-08T23:14:09.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:14:09.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:14:09.167 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-377957122053
2026-03-08T23:14:09.167 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:14:09.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=377957122053
2026-03-08T23:14:09.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 377957122053'
2026-03-08T23:14:09.168 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 377957122053
2026-03-08T23:14:09.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:14:09.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 377957122053 -lt 377957122053
2026-03-08T23:14:09.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:14:09.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:14:09.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:14:09.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:14:09.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:14:09.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:14:09.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:14:09.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:14:09.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:14:09.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:14:09.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:14:09.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:14:09.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:14:09.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:14:09.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:14:09.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:14:09.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:14:09.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:14:09.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1069: TEST_corrupt_scrub_replicated: rm td/osd-scrub-repair/extra-val
2026-03-08T23:14:09.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:14:09.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ7
2026-03-08T23:14:09.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 7 % 2
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1074: TEST_corrupt_scrub_replicated: echo -n newheader
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1075: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ7 set-omaphdr td/osd-scrub-repair/hdr
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ7 set-omaphdr td/osd-scrub-repair/hdr
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:14:09.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:14:09.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:14:09.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:14:09.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:14:09.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:14:09.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:14:09.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:14:09.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:14:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:14:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ7 set-omaphdr td/osd-scrub-repair/hdr
2026-03-08T23:14:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:14:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:14:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:14:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:14:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:14:10.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ7 set-omaphdr td/osd-scrub-repair/hdr
2026-03-08T23:14:11.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:14:11.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:14:11.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:14:11.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:14:11.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:14:11.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:14:11.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:14:11.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:14:11.267 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:14:11.268 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:14:11.268 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:14:11.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 
2026-03-08T23:14:11.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:14:11.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:14:11.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:14:11.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:14:11.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:14:11.284 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:11.285+0000 7faf23b5f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:11.285 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:11.285+0000 7faf23b5f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:11.287 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:11.285+0000 7faf23b5f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:11.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:11.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:11.736 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:11.737+0000 7faf23b5f8c0 -1 Falling back to public interface 2026-03-08T23:14:12.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:14:12.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:12.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:14:12.642 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:14:12.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:12.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:12.772 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:12.773+0000 7faf23b5f8c0 -1 osd.1 95 log_to_monitors true 2026-03-08T23:14:12.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:13.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:13.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:13.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:14:13.832 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:14:13.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:13.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 
up_from 99 up_thru 94 down_at 96 last_clean_interval [88,95) [v2:127.0.0.1:6810/1071131483,v1:127.0.0.1:6811/1071131483] [v2:127.0.0.1:6812/1071131483,v1:127.0.0.1:6813/1071131483] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:14:14.024 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:14:14.025 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:14:14.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:14:14.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:14:14.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:14:14.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:14:14.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:14:14.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:14:14.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:14:14.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:14:14.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:14:14.316 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:14:14.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:14:14.316 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:14.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:14:14.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=399431958532 2026-03-08T23:14:14.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 399431958532 2026-03-08T23:14:14.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-399431958532' 2026-03-08T23:14:14.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:14.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:14:14.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=425201762306 2026-03-08T23:14:14.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 425201762306 2026-03-08T23:14:14.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-399431958532 1-425201762306' 2026-03-08T23:14:14.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:14.486 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-399431958532 2026-03-08T23:14:14.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:14.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:14:14.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-399431958532 2026-03-08T23:14:14.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:14.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=399431958532 2026-03-08T23:14:14.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 399431958532' 2026-03-08T23:14:14.490 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 399431958532 2026-03-08T23:14:14.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:14.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 399431958531 -lt 399431958532 2026-03-08T23:14:14.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:15.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T23:14:15.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:15.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 399431958531 -lt 399431958532 2026-03-08T23:14:15.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:16.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:14:16.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:17.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 399431958533 -lt 399431958532 2026-03-08T23:14:17.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:17.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-425201762306 2026-03-08T23:14:17.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:17.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:14:17.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-425201762306 2026-03-08T23:14:17.038 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:17.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=425201762306 2026-03-08T23:14:17.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 425201762306' 2026-03-08T23:14:17.039 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 425201762306 2026-03-08T23:14:17.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:14:17.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762306 -lt 425201762306 2026-03-08T23:14:17.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:14:17.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:17.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:17.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:14:17.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:14:17.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 
2026-03-08T23:14:17.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:14:17.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:14:17.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:14:17.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:14:17.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:14:17.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:14:17.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:14:17.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:17.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:17.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:14:17.802 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:14:17.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:14:17.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1076: TEST_corrupt_scrub_replicated: rm td/osd-scrub-repair/hdr 2026-03-08T23:14:17.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:14:17.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ8 2026-03-08T23:14:17.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 8 % 2 2026-03-08T23:14:17.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0 2026-03-08T23:14:17.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:14:17.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1080: TEST_corrupt_scrub_replicated: rados --pool csr_pool setxattr ROBJ8 key1-ROBJ8 val1-ROBJ8 2026-03-08T23:14:17.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1081: TEST_corrupt_scrub_replicated: rados --pool csr_pool setxattr ROBJ8 key2-ROBJ8 val2-ROBJ8 2026-03-08T23:14:17.848 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1084: TEST_corrupt_scrub_replicated: echo -n bad-val 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1085: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ8 set-attr _key1-ROBJ8 td/osd-scrub-repair/bad-val 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ8 set-attr _key1-ROBJ8 td/osd-scrub-repair/bad-val 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:14:17.848 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:14:17.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T23:14:17.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:14:17.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:14:17.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:14:17.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:14:17.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:14:18.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:14:18.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ8 set-attr _key1-ROBJ8 td/osd-scrub-repair/bad-val
2026-03-08T23:14:18.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:14:18.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:14:18.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:14:18.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:14:18.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:14:18.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ8 set-attr _key1-ROBJ8 td/osd-scrub-repair/bad-val
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:14:19.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:14:19.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:14:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:14:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:14:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:14:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:14:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:14:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:14:19.344 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T23:14:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:14:19.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:14:19.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:14:19.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:14:19.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:14:19.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:14:19.363 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:19.361+0000 7fcaee77e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:14:19.363 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:19.365+0000 7fcaee77e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:14:19.365 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:19.365+0000 7fcaee77e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:14:19.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:14:19.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:14:20.571 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:20.573+0000 7fcaee77e8c0 -1 Falling back to public interface
2026-03-08T23:14:20.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:14:20.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:14:20.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:14:20.706 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:14:20.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:14:20.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:14:20.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:14:21.547 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:21.549+0000 7fcaee77e8c0 -1 osd.0 101 log_to_monitors true
2026-03-08T23:14:21.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:14:21.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:14:21.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:14:21.881 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:14:21.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:14:21.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:14:22.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:14:23.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:14:23.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:14:23.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:14:23.059 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:14:23.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:14:23.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:14:23.219 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 105 up_thru 105 down_at 102 last_clean_interval [93,101) [v2:127.0.0.1:6802/3542111811,v1:127.0.0.1:6803/3542111811] [v2:127.0.0.1:6804/3542111811,v1:127.0.0.1:6805/3542111811] exists,up b8b43426-8172-41a4-ad76-a875625b04e6
2026-03-08T23:14:23.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:14:23.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:14:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:14:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:14:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:14:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:14:23.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:14:23.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:14:23.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:14:23.443 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:14:23.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:14:23.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:14:23.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:14:23.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=450971566082
2026-03-08T23:14:23.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 450971566082
2026-03-08T23:14:23.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566082'
2026-03-08T23:14:23.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:14:23.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:14:23.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=425201762309
2026-03-08T23:14:23.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 425201762309
2026-03-08T23:14:23.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566082 1-425201762309'
2026-03-08T23:14:23.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:14:23.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-450971566082
2026-03-08T23:14:23.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:14:23.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:14:23.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-450971566082
2026-03-08T23:14:23.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:14:23.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=450971566082
2026-03-08T23:14:23.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 450971566082'
2026-03-08T23:14:23.601 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 450971566082
2026-03-08T23:14:23.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:14:23.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 450971566082
2026-03-08T23:14:23.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:14:24.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:14:24.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:14:24.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566082 -lt 450971566082
2026-03-08T23:14:24.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:14:24.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-425201762309
2026-03-08T23:14:24.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:14:24.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:14:24.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-425201762309
2026-03-08T23:14:24.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:14:24.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=425201762309
2026-03-08T23:14:24.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 425201762309'
2026-03-08T23:14:24.934 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 425201762309
2026-03-08T23:14:24.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:14:25.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 425201762309 -lt 425201762309
2026-03-08T23:14:25.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:14:25.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:14:25.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:14:25.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:14:25.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:14:25.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:14:25.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:14:25.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:14:25.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:14:25.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:14:25.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:14:25.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:14:25.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:14:25.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:14:25.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1086: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ8 rm-attr _key2-ROBJ8
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ8 rm-attr _key2-ROBJ8
2026-03-08T23:14:25.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:14:25.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:14:25.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:14:25.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ8 rm-attr _key2-ROBJ8
2026-03-08T23:14:25.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:14:25.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:14:25.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:14:25.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:14:25.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:14:25.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ8 rm-attr _key2-ROBJ8
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:14:27.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:14:27.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:14:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:14:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:14:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:14:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:14:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:14:27.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:14:27.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:14:27.377 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:14:27.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:14:27.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:14:27.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:14:27.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:14:27.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:14:27.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:14:27.393 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:27.393+0000 7fa38af358c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:27.399 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:27.401+0000 7fa38af358c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:27.401 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:27.401+0000 7fa38af358c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:14:27.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:14:27.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:14:27.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:14:27.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:14:27.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:14:27.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:27.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:14:27.559 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:14:27.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:27.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:27.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:28.355 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:28.357+0000 7fa38af358c0 -1 Falling back to public interface 2026-03-08T23:14:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:14:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:14:28.742 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:14:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:28.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:29.337 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:29.337+0000 7fa38af358c0 -1 osd.1 106 log_to_monitors true 2026-03-08T23:14:29.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:29.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:29.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:14:29.916 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:14:29.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:29.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:30.094 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:30.214 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:30.213+0000 7fa381ee5640 -1 osd.1 106 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:14:31.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:31.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:31.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:14:31.096 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:14:31.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:31.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:31.263 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 110 up_thru 110 down_at 107 last_clean_interval [99,106) [v2:127.0.0.1:6810/1645161104,v1:127.0.0.1:6811/1645161104] [v2:127.0.0.1:6812/1645161104,v1:127.0.0.1:6813/1645161104] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:14:31.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:14:31.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:14:31.264 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:14:31.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:14:31.327 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:14:31.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:14:31.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:14:31.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:14:31.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:14:31.327 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:14:31.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:14:31.490 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:14:31.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:14:31.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:31.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:14:31.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=450971566084 
2026-03-08T23:14:31.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 450971566084 2026-03-08T23:14:31.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566084' 2026-03-08T23:14:31.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:31.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:14:31.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=472446402562 2026-03-08T23:14:31.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 472446402562 2026-03-08T23:14:31.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566084 1-472446402562' 2026-03-08T23:14:31.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:31.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-450971566084 2026-03-08T23:14:31.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:31.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:14:31.663 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-450971566084 2026-03-08T23:14:31.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:31.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=450971566084 2026-03-08T23:14:31.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 450971566084' 2026-03-08T23:14:31.664 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 450971566084 2026-03-08T23:14:31.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:31.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566083 -lt 450971566084 2026-03-08T23:14:31.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:32.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:14:32.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:33.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566085 -lt 450971566084 2026-03-08T23:14:33.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:14:33.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-472446402562 2026-03-08T23:14:33.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:33.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:14:33.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-472446402562 2026-03-08T23:14:33.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:33.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=472446402562 2026-03-08T23:14:33.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 472446402562' 2026-03-08T23:14:33.009 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 472446402562 2026-03-08T23:14:33.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:14:33.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 472446402562 -lt 472446402562 2026-03-08T23:14:33.178 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:14:33.179 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:33.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:33.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:14:33.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:14:33.390 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:14:33.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:14:33.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:14:33.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:14:33.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:14:33.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:14:33.563 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:14:33.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:14:33.564 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:33.564 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1087: TEST_corrupt_scrub_replicated: echo -n val3-ROBJ8 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1088: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ8 set-attr _key3-ROBJ8 td/osd-scrub-repair/newval 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:14:33.770 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ8 set-attr _key3-ROBJ8 td/osd-scrub-repair/newval 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 
2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:14:33.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:14:33.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:14:33.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ8 set-attr _key3-ROBJ8 td/osd-scrub-repair/newval 2026-03-08T23:14:33.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:14:33.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:14:33.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:14:33.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:14:33.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:14:33.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ8 set-attr _key3-ROBJ8 td/osd-scrub-repair/newval 2026-03-08T23:14:35.062 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:14:35.062 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:14:35.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:14:35.063 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
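[editor's note] The `--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok` argument above carries literal `$cluster` and `$name` because get_asok_path single-quotes them, leaving the substitution to the ceph-osd daemon at runtime. A sketch reconstructed from the trace (the named-argument branch is an assumption; only the empty-name branch appears in this log, and the `_sketch` names are ours):

```shell
# Per-process socket directory, as echoed by get_asok_dir in the trace.
get_asok_dir_sketch() {
    echo "/tmp/ceph-asok.$$"
}

get_asok_path_sketch() {
    local name=$1
    if [ -n "$name" ]; then
        # assumed branch: a concrete daemon name yields a concrete path
        echo "$(get_asok_dir_sketch)/ceph-$name.asok"
    else
        # literal $cluster/$name metavariables, expanded later by the daemon
        echo "$(get_asok_dir_sketch)"'/$cluster-$name.asok'
    fi
}
```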
2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:14:35.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:14:35.064 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:14:35.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:14:35.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:14:35.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:14:35.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:14:35.066 
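[editor's note] The `cat td/osd-scrub-repair/1/whoami` followed by `'[' 1 = 1 ']'` above is activate_osd confirming that the data directory really belongs to the OSD id it was asked to start. A minimal standalone restatement (`check_whoami` is our name, not the helper's):

```shell
# Sanity check after launching ceph-osd: the whoami file written into the
# OSD data directory must name the id we intended to activate.
check_whoami() {
    local osd_data=$1 id=$2
    [ "$(cat "$osd_data/whoami" 2>/dev/null)" = "$id" ]
}
```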
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:14:35.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:14:35.082 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:35.081+0000 7f87ea1468c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:35.082 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:35.081+0000 7f87ea1468c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:35.083 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:35.085+0000 7f87ea1468c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:35.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:35.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:36.283 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:36.285+0000 7f87ea1468c0 -1 Falling back to public interface 2026-03-08T23:14:36.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:36.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:36.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:14:36.421 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:14:36.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:36.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:36.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:37.510 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:37.509+0000 7f87ea1468c0 -1 osd.1 112 log_to_monitors true 2026-03-08T23:14:37.588 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:37.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:37.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:14:37.588 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:14:37.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:37.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:37.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:38.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:38.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:38.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:14:38.776 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:14:38.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:38.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:38.939 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 116 
up_thru 116 down_at 113 last_clean_interval [110,112) [v2:127.0.0.1:6810/14170075,v1:127.0.0.1:6811/14170075] [v2:127.0.0.1:6812/14170075,v1:127.0.0.1:6813/14170075] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:14:38.940 
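[editor's note] The wait_for_osd loop that just returned 0 above polls `ceph osd dump | grep "osd.$id up"` once per second, for up to 300 attempts. Its shape, with the predicate parameterized so the sketch runs without a cluster (`wait_for_sketch` is our name):

```shell
# Generic form of the wait_for_osd loop in the trace: retry a predicate
# once per second for up to $tries attempts; succeed as soon as it does.
wait_for_sketch() {
    local tries=$1; shift
    local i
    for ((i = 0; i < tries; i++)); do
        if "$@"; then
            return 0        # e.g. ceph osd dump | grep "osd.$id up"
        fi
        sleep 1
    done
    return 1                # condition never held within the budget
}
```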
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:14:38.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:14:39.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:14:39.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:14:39.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:14:39.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:14:39.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:14:39.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:14:39.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:14:39.191 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:14:39.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:14:39.191 
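[editor's note] The delays array above ('0.1' '0.2' … '15' '4.5') sums to exactly the 90-second timeout passed to get_timeout_delays. A reconstruction inferred from that output (not copied from ceph-helpers.sh): delays double from the base step, each sleep is capped at 15 s, and the final entry is trimmed so the total matches the timeout. Integer tenths keep the arithmetic exact:

```shell
# Rebuild the backoff schedule seen in the trace: doubling delays from
# $step, capped at 15s each, trimmed at the end to total exactly $timeout.
timeout_delays_sketch() {
    local timeout=$1 step=$2
    awk -v total="$timeout" -v d="$step" 'BEGIN {
        total = total * 10; d = d * 10   # work in tenths: exact arithmetic
        used = 0; sep = ""
        while (used < total) {
            if (d > 150) d = 150                    # cap each sleep at 15s
            if (used + d > total) d = total - used  # trim the final entry
            printf "%s%g", sep, d / 10; sep = " "
            used += d
            d *= 2
        }
        print ""
    }'
}

timeout_delays_sketch 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```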
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:39.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:14:39.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=450971566087 2026-03-08T23:14:39.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 450971566087 2026-03-08T23:14:39.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566087' 2026-03-08T23:14:39.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:39.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:14:39.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=498216206338 2026-03-08T23:14:39.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 498216206338 2026-03-08T23:14:39.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566087 1-498216206338' 2026-03-08T23:14:39.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:39.359 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-450971566087 2026-03-08T23:14:39.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:39.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:14:39.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-450971566087 2026-03-08T23:14:39.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:39.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=450971566087 2026-03-08T23:14:39.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 450971566087' 2026-03-08T23:14:39.361 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 450971566087 2026-03-08T23:14:39.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:39.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566086 -lt 450971566087 2026-03-08T23:14:39.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:40.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T23:14:40.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:40.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566087 -lt 450971566087 2026-03-08T23:14:40.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:40.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-498216206338 2026-03-08T23:14:40.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:40.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:14:40.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-498216206338 2026-03-08T23:14:40.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:40.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=498216206338 2026-03-08T23:14:40.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 498216206338' 2026-03-08T23:14:40.697 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 498216206338 2026-03-08T23:14:40.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:14:40.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 498216206338 -lt 498216206338 2026-03-08T23:14:40.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:14:40.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:40.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:41.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:14:41.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:14:41.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:14:41.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:14:41.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:14:41.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:14:41.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: 
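[editor's note] flush_pg_stats, traced above, packs each OSD and its flush sequence into `osd-seq` tokens (`seqs=' 0-450971566087 1-498216206338'`) and splits them back with `cut` before polling `ceph osd last-stat-seq`. The parsing step in isolation, wrapped in a function (`parse_seqs` is our name):

```shell
# Exactly the split visible in the trace: field 1 before the dash is the
# osd id, field 2 after it is the flush sequence number to wait for.
parse_seqs() {
    local seqs=$1 s osd seq
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)   # osd id
        seq=$(echo "$s" | cut -d - -f 2)   # flush sequence
        echo "waiting osd.$osd seq $seq"
    done
}

parse_seqs ' 0-450971566087 1-498216206338'
```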
get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:14:41.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:14:41.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:14:41.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:14:41.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:41.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:41.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:14:41.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:14:41.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:14:41.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1089: TEST_corrupt_scrub_replicated: rm td/osd-scrub-repair/bad-val td/osd-scrub-repair/newval 2026-03-08T23:14:41.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:14:41.460 
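[editor's note] The jq filter above counts PG states that contain both "active" and "clean" but not "stale". The same predicate restated in plain bash over a list of state strings, as an illustrative stand-in for the `pg dump pgs` JSON pipeline (`count_active_clean_sketch` is our name):

```shell
# Count states matching the get_num_active_clean predicate from the
# trace: must mention both active and clean, and must not mention stale.
count_active_clean_sketch() {
    local n=0 state
    for state in "$@"; do
        case $state in
            *stale*) ;;                                  # stale never counts
            *active*clean*|*clean*active*) n=$((n + 1)) ;;
        esac
    done
    echo "$n"
}

count_active_clean_sketch active+clean active+clean+scrubbing stale+active+clean peering
```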
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ9 2026-03-08T23:14:41.460 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 9 % 2 2026-03-08T23:14:41.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1 2026-03-08T23:14:41.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1093: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ9 get-attr _ 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ9 get-attr _ 2026-03-08T23:14:41.462 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:14:41.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:14:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:14:41.569 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ9 get-attr _ 2026-03-08T23:14:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:14:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:14:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:14:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:14:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:14:41.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ9 get-attr _ 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:14:42.204 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:14:42.204 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:14:42.204 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:14:42.205 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:14:42.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:14:42.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:14:42.207 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:14:42.207 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:14:42.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:14:42.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:14:42.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:14:42.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:14:42.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:14:42.225 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:42.225+0000 7fa705baa8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:42.225 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:42.225+0000 7fa705baa8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:14:42.227 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:42.225+0000 7fa705baa8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:42.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:42.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:43.423 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:43.425+0000 7fa705baa8c0 -1 Falling back to 
public interface 2026-03-08T23:14:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:14:43.583 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:14:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:43.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:44.470 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:44.469+0000 7fa705baa8c0 -1 osd.1 117 log_to_monitors true 2026-03-08T23:14:44.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:44.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:44.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:14:44.766 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:14:44.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:44.766 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:44.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:45.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:45.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:45.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:14:45.964 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:14:45.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:45.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 121 up_thru 121 down_at 118 last_clean_interval [116,117) [v2:127.0.0.1:6810/1401246397,v1:127.0.0.1:6811/1401246397] [v2:127.0.0.1:6812/1401246397,v1:127.0.0.1:6813/1401246397] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 
2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:14:46.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:14:46.144 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:14:46.144 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:14:46.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:14:46.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:14:46.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:14:46.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 
2026-03-08T23:14:46.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:14:46.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:14:46.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:14:46.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:14:46.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:14:46.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:14:46.403 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:14:46.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:14:46.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:46.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:14:46.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=450971566089 2026-03-08T23:14:46.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 450971566089 2026-03-08T23:14:46.498 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566089' 2026-03-08T23:14:46.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:46.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:14:46.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=519691042818 2026-03-08T23:14:46.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 519691042818 2026-03-08T23:14:46.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566089 1-519691042818' 2026-03-08T23:14:46.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:46.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-450971566089 2026-03-08T23:14:46.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:46.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:14:46.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-450971566089 2026-03-08T23:14:46.587 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:46.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=450971566089 2026-03-08T23:14:46.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 450971566089' 2026-03-08T23:14:46.588 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 450971566089 2026-03-08T23:14:46.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:46.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566088 -lt 450971566089 2026-03-08T23:14:46.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:47.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:14:47.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:47.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566088 -lt 450971566089 2026-03-08T23:14:47.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:48.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 
-eq 0 ']' 2026-03-08T23:14:48.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:49.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566090 -lt 450971566089 2026-03-08T23:14:49.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:49.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-519691042818 2026-03-08T23:14:49.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:49.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:14:49.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-519691042818 2026-03-08T23:14:49.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:49.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=519691042818 2026-03-08T23:14:49.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 519691042818' 2026-03-08T23:14:49.135 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 519691042818 2026-03-08T23:14:49.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:14:49.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 519691042818 -lt 519691042818 2026-03-08T23:14:49.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:14:49.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:49.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:14:49.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:14:49.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:14:49.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:14:49.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:14:49.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:14:49.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: 
get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:14:49.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:14:49.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:14:49.699 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:14:49.699 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:49.699 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:49.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:14:49.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:14:49.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:14:49.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1094: TEST_corrupt_scrub_replicated: echo -n D 2026-03-08T23:14:49.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1095: TEST_corrupt_scrub_replicated: rados --pool csr_pool put ROBJ9 td/osd-scrub-repair/change 2026-03-08T23:14:49.940 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1096: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ9 set-attr _ td/osd-scrub-repair/robj9-oi 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ9 set-attr _ td/osd-scrub-repair/robj9-oi 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:14:49.941 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:14:49.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:14:50.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:14:50.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ9 set-attr _ td/osd-scrub-repair/robj9-oi 2026-03-08T23:14:50.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:14:50.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:14:50.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:14:50.046 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:14:50.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:14:50.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ9 set-attr _ td/osd-scrub-repair/robj9-oi 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:14:51.255 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 
2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:14:51.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:14:51.256 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:14:51.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:14:51.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:14:51.257 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:14:51.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 
2026-03-08T23:14:51.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:14:51.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:14:51.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:14:51.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:14:51.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:14:51.273 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:51.273+0000 7f51a73308c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:51.273 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:51.273+0000 7f51a73308c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:51.275 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:51.277+0000 7f51a73308c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:51.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:51.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:51.979 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:51.981+0000 7f51a73308c0 -1 Falling back to public interface 2026-03-08T23:14:52.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:14:52.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:52.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:14:52.621 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:14:52.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:52.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:52.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:14:53.006 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:53.005+0000 7f51a73308c0 -1 osd.1 123 log_to_monitors true 2026-03-08T23:14:53.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:14:53.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:53.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:14:53.811 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:14:53.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:53.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:14:54.007 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 
up_from 127 up_thru 127 down_at 124 last_clean_interval [121,123) [v2:127.0.0.1:6810/3397879421,v1:127.0.0.1:6811/3397879421] [v2:127.0.0.1:6812/3397879421,v1:127.0.0.1:6813/3397879421] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:14:54.008 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:14:54.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:14:54.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:14:54.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:14:54.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:14:54.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:14:54.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:14:54.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:14:54.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:14:54.255 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:14:54.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:14:54.255 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:54.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:14:54.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=450971566092 2026-03-08T23:14:54.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 450971566092 2026-03-08T23:14:54.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566092' 2026-03-08T23:14:54.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:14:54.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:14:54.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=545460846594 2026-03-08T23:14:54.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 545460846594 2026-03-08T23:14:54.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-450971566092 1-545460846594' 2026-03-08T23:14:54.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:54.418 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-450971566092 2026-03-08T23:14:54.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:54.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:14:54.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-450971566092 2026-03-08T23:14:54.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:54.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=450971566092 2026-03-08T23:14:54.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 450971566092' 2026-03-08T23:14:54.421 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 450971566092 2026-03-08T23:14:54.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:14:54.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 450971566092 -lt 450971566092 2026-03-08T23:14:54.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:14:54.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-545460846594 
2026-03-08T23:14:54.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:14:54.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:14:54.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-545460846594 2026-03-08T23:14:54.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:14:54.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=545460846594 2026-03-08T23:14:54.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 545460846594' 2026-03-08T23:14:54.601 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 545460846594 2026-03-08T23:14:54.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:14:54.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 545460846594 2026-03-08T23:14:54.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:55.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:14:55.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 1 2026-03-08T23:14:55.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 545460846594 2026-03-08T23:14:55.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:14:56.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:14:56.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:14:57.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 545460846594 -lt 545460846594 2026-03-08T23:14:57.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:14:57.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:57.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:57.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:14:57.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:14:57.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:14:57.341 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:14:57.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:14:57.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:14:57.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:14:57.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:14:57.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:14:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:14:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:14:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:14:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:14:57.722 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:14:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:14:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1097: TEST_corrupt_scrub_replicated: rm td/osd-scrub-repair/oi td/osd-scrub-repair/change 2026-03-08T23:14:57.723 INFO:tasks.workunit.client.0.vm03.stderr:rm: cannot remove 'td/osd-scrub-repair/oi': No such file or directory 2026-03-08T23:14:57.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:14:57.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ10 2026-03-08T23:14:57.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 10 % 2 2026-03-08T23:14:57.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0 2026-03-08T23:14:57.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:14:57.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:14:57.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: 
TEST_corrupt_scrub_replicated: objname=ROBJ11 2026-03-08T23:14:57.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 11 % 2 2026-03-08T23:14:57.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1 2026-03-08T23:14:57.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:14:57.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:14:57.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ12 2026-03-08T23:14:57.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 12 % 2 2026-03-08T23:14:57.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0 2026-03-08T23:14:57.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:14:57.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:14:57.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ13 
2026-03-08T23:14:57.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 13 % 2 2026-03-08T23:14:57.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1 2026-03-08T23:14:57.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:14:57.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:14:57.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ14 2026-03-08T23:14:57.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 14 % 2 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1106: TEST_corrupt_scrub_replicated: echo -n bad-val 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1107: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ14 set-attr _ 
td/osd-scrub-repair/bad-val 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ14 set-attr _ td/osd-scrub-repair/bad-val 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:14:57.728 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:14:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:14:57.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:14:57.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ14 set-attr _ td/osd-scrub-repair/bad-val 2026-03-08T23:14:57.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:14:57.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:14:57.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:14:57.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:14:57.834 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:14:57.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ14 set-attr _ td/osd-scrub-repair/bad-val 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:14:59.042 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:14:59.042 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:14:59.042 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:14:59.043 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:14:59.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:14:59.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:14:59.044 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:14:59.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:14:59.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:14:59.045 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:14:59.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:14:59.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:14:59.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:14:59.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:59.061+0000 7fbc92f6d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:59.062 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:59.065+0000 7fbc92f6d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:14:59.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:14:59.065+0000 7fbc92f6d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:14:59.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:14:59.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:00.027 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:00.030+0000 7fbc92f6d8c0 -1 Falling back to public interface 2026-03-08T23:15:00.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:15:00.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:00.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:15:00.397 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:15:00.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:00.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:15:00.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:01.012 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:01.014+0000 7fbc92f6d8c0 -1 osd.0 129 log_to_monitors true 2026-03-08T23:15:01.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:01.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:01.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:15:01.559 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:15:01.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:01.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:15:01.736 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:01.889 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:01.890+0000 7fbc89f1d640 -1 osd.0 129 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:15:02.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:02.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:02.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:15:02.737 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:15:02.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:02.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 133 up_thru 133 down_at 130 last_clean_interval [105,129) [v2:127.0.0.1:6802/848371209,v1:127.0.0.1:6803/848371209] [v2:127.0.0.1:6804/848371209,v1:127.0.0.1:6805/848371209] exists,up b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:15:02.901 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:15:02.901 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:15:02.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:15:02.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:15:02.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:15:02.962 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:15:02.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:15:02.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:15:02.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:15:02.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:15:02.962 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:15:03.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:15:03.126 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:15:03.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:15:03.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:15:03.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:15:03.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=571230650370 
2026-03-08T23:15:03.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 571230650370 2026-03-08T23:15:03.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-571230650370' 2026-03-08T23:15:03.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:15:03.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:15:03.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=545460846597 2026-03-08T23:15:03.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 545460846597 2026-03-08T23:15:03.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-571230650370 1-545460846597' 2026-03-08T23:15:03.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:03.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-571230650370 2026-03-08T23:15:03.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:03.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:15:03.298 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-571230650370 2026-03-08T23:15:03.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:03.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=571230650370 2026-03-08T23:15:03.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 571230650370' 2026-03-08T23:15:03.299 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 571230650370 2026-03-08T23:15:03.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:03.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 571230650370 2026-03-08T23:15:03.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:15:04.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:15:04.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:04.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 571230650370 -lt 571230650370 2026-03-08T23:15:04.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s 
in $seqs 2026-03-08T23:15:04.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-545460846597 2026-03-08T23:15:04.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:15:04.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-545460846597 2026-03-08T23:15:04.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:04.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=545460846597 2026-03-08T23:15:04.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 545460846597' 2026-03-08T23:15:04.630 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 545460846597 2026-03-08T23:15:04.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:15:04.797 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 545460846597 -lt 545460846597 2026-03-08T23:15:04.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:15:04.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: 
get_num_pgs: ceph --format json status 2026-03-08T23:15:04.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:04.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:15:04.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:15:04.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:15:04.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:15:04.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:15:04.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:15:04.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:15:04.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:15:05.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:15:05.157 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:15:05.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:05.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1108: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ14 rm-attr _ 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:15:05.350 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ14 rm-attr _ 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:15:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:15:05.455 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:15:05.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ14 rm-attr _
2026-03-08T23:15:05.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:15:05.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:15:05.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:15:05.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:15:05.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:15:05.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ14 rm-attr _
2026-03-08T23:15:06.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:15:06.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:15:06.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:15:06.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:15:06.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:15:06.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:15:06.658 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:15:06.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:15:06.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:15:06.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:15:06.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T23:15:06.661 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T23:15:06.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:15:06.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T23:15:06.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T23:15:06.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:15:06.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:15:06.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:15:06.677 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:06.678+0000 7f83c3ee78c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:06.677 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:06.678+0000 7f83c3ee78c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:06.679 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:06.682+0000 7f83c3ee78c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:06.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:15:07.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:07.635 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:07.638+0000 7f83c3ee78c0 -1 Falling back to public interface
2026-03-08T23:15:08.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:08.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:08.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:15:08.010 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:15:08.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:08.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:15:08.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:08.886 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:08.890+0000 7f83c3ee78c0 -1 osd.1 134 log_to_monitors true
2026-03-08T23:15:09.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:09.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:09.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:15:09.175 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:15:09.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:09.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:15:09.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:09.933 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:09.934+0000 7f83bae97640 -1 osd.1 134 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:15:10.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:10.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:10.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:15:10.353 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:15:10.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:10.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 138 up_thru 138 down_at 135 last_clean_interval [127,134) [v2:127.0.0.1:6810/3143189469,v1:127.0.0.1:6811/3143189469] [v2:127.0.0.1:6812/3143189469,v1:127.0.0.1:6813/3143189469] exists,up f603f62e-24cf-4fca-903d-50e492fba08d
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:15:10.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:15:10.515 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:15:10.515 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:15:10.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:15:10.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:15:10.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:15:10.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:15:10.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:15:10.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:15:10.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:15:10.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:15:10.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:15:10.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:15:10.735 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:15:10.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:15:10.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:15:10.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:15:10.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=571230650372
2026-03-08T23:15:10.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 571230650372
2026-03-08T23:15:10.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-571230650372'
2026-03-08T23:15:10.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:15:10.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:15:10.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=592705486850
2026-03-08T23:15:10.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 592705486850
2026-03-08T23:15:10.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-571230650372 1-592705486850'
2026-03-08T23:15:10.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:15:10.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-571230650372
2026-03-08T23:15:10.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:15:10.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:15:10.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-571230650372
2026-03-08T23:15:10.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:15:10.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=571230650372
2026-03-08T23:15:10.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 571230650372'
2026-03-08T23:15:10.893 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 571230650372
2026-03-08T23:15:10.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:15:11.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 571230650371 -lt 571230650372
2026-03-08T23:15:11.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:15:12.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:15:12.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:15:12.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 571230650371 -lt 571230650372
2026-03-08T23:15:12.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:15:13.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:15:13.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:15:13.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 571230650373 -lt 571230650372
2026-03-08T23:15:13.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:15:13.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-592705486850
2026-03-08T23:15:13.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:15:13.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:15:13.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-592705486850
2026-03-08T23:15:13.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:15:13.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=592705486850
2026-03-08T23:15:13.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 592705486850'
2026-03-08T23:15:13.380 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 592705486850
2026-03-08T23:15:13.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:15:13.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 592705486850 -lt 592705486850
2026-03-08T23:15:13.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:15:13.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:15:13.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:15:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:15:13.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:15:13.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:15:13.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:15:13.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:15:13.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:15:13.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:15:13.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:15:13.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:15:13.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:15:13.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:15:13.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:15:14.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:15:14.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:15:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:15:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1109: TEST_corrupt_scrub_replicated: rm td/osd-scrub-repair/bad-val
2026-03-08T23:15:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:15:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ15
2026-03-08T23:15:14.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 15 % 2
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1113: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ15 rm-attr _
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ15 rm-attr _
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:15:14.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:15:14.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:15:14.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:15:14.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:15:14.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:15:14.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:15:14.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:15:14.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ15 rm-attr _
2026-03-08T23:15:14.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:15:14.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:15:14.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:15:14.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:15:14.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:15:14.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ15 rm-attr _
2026-03-08T23:15:14.907 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (61) No data available
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:15:15.443 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:15:15.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:15:15.444 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:15:15.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:15:15.445 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:15:15.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:15:15.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:15:15.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:15:15.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:15:15.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:15:15.454 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:15:15.462 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:15.462+0000 7f3042cb08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:15.462 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:15.466+0000 7f3042cb08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:15.464 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:15.466+0000 7f3042cb08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:15:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:15:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:15:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:15:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:15:15.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:15.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:15:15.622 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:15:15.622 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:15.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:15.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:16.427 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:16.430+0000 7f3042cb08c0 -1 Falling back to public interface 2026-03-08T23:15:16.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:16.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:16.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:15:16.791 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:15:16.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:16.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:16.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:17.662 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:17.662+0000 7f3042cb08c0 -1 osd.1 140 log_to_monitors true 2026-03-08T23:15:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:17.957 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:15:17.957 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:15:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:18.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:19.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:19.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:19.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:15:19.130 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:15:19.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:19.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 144 up_thru 144 down_at 141 last_clean_interval [138,140) [v2:127.0.0.1:6810/3921138850,v1:127.0.0.1:6811/3921138850] 
[v2:127.0.0.1:6812/3921138850,v1:127.0.0.1:6813/3921138850] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:15:19.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:15:19.298 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:15:19.298 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:15:19.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 
2026-03-08T23:15:19.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:15:19.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:15:19.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:15:19.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:15:19.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:15:19.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:15:19.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:15:19.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:15:19.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:15:19.525 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:15:19.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:15:19.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:15:19.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:15:19.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=571230650375 2026-03-08T23:15:19.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 571230650375 2026-03-08T23:15:19.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-571230650375' 2026-03-08T23:15:19.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:15:19.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:15:19.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=618475290626 2026-03-08T23:15:19.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 618475290626 2026-03-08T23:15:19.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-571230650375 1-618475290626' 2026-03-08T23:15:19.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:19.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-571230650375 2026-03-08T23:15:19.676 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:19.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:15:19.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-571230650375 2026-03-08T23:15:19.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:19.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=571230650375 2026-03-08T23:15:19.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 571230650375' 2026-03-08T23:15:19.679 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 571230650375 2026-03-08T23:15:19.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:19.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 571230650374 -lt 571230650375 2026-03-08T23:15:19.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:15:20.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:15:20.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:15:21.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 571230650375 -lt 571230650375 2026-03-08T23:15:21.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:21.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-618475290626 2026-03-08T23:15:21.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:21.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:15:21.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-618475290626 2026-03-08T23:15:21.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:21.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=618475290626 2026-03-08T23:15:21.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 618475290626' 2026-03-08T23:15:21.017 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 618475290626 2026-03-08T23:15:21.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:15:21.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
test 618475290626 -lt 618475290626 2026-03-08T23:15:21.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:15:21.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:21.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:21.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:15:21.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:15:21.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:15:21.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:15:21.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:15:21.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:15:21.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:15:21.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: 
get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:15:21.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:15:21.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:15:21.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:21.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:15:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:15:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:15:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:15:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ16 2026-03-08T23:15:21.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 16 % 2 2026-03-08T23:15:21.775 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0 2026-03-08T23:15:21.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:15:21.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1117: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ16 rm-attr snapset 2026-03-08T23:15:21.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:15:21.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:15:21.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ16 rm-attr snapset 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:15:21.776 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:15:21.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:15:21.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:15:21.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:15:21.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ16 rm-attr snapset 2026-03-08T23:15:21.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:15:21.882 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:15:21.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:15:21.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:15:21.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:15:21.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ16 rm-attr snapset 2026-03-08T23:15:22.555 INFO:tasks.workunit.client.0.vm03.stderr:Error decoding attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (22) Invalid argument 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:15:23.090 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:15:23.090 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:15:23.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:15:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:15:23.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:15:23.092 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:15:23.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:15:23.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:15:23.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:15:23.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:15:23.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:15:23.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:15:23.108 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:23.110+0000 7f9458d5d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:23.109 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:23.110+0000 7f9458d5d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:23.118 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:23.114+0000 7f9458d5d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:15:23.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:24.063 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:24.066+0000 7f9458d5d8c0 -1 Falling back to public interface 2026-03-08T23:15:24.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:15:24.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:24.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:15:24.471 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:15:24.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:24.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:15:24.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:25.031 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:25.034+0000 7f9458d5d8c0 -1 osd.0 145 log_to_monitors true 2026-03-08T23:15:25.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:25.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:25.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:15:25.639 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:15:25.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:25.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:15:25.818 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:15:26.819 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:15:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:15:26.985 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 149 up_thru 149 down_at 146 last_clean_interval [133,145) [v2:127.0.0.1:6802/2277499862,v1:127.0.0.1:6803/2277499862] [v2:127.0.0.1:6804/2277499862,v1:127.0.0.1:6805/2277499862] exists,up b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:15:26.986 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:15:26.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:15:26.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:15:26.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:15:27.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:15:27.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:15:27.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:15:27.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:15:27.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:15:27.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:15:27.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:15:27.229 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:15:27.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:15:27.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:15:27.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:15:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=639950127106 2026-03-08T23:15:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 639950127106 2026-03-08T23:15:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-639950127106' 2026-03-08T23:15:27.310 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:15:27.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:15:27.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=618475290628 2026-03-08T23:15:27.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 618475290628 2026-03-08T23:15:27.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-639950127106 1-618475290628' 2026-03-08T23:15:27.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:27.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-639950127106 2026-03-08T23:15:27.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:27.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:15:27.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-639950127106 2026-03-08T23:15:27.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:27.388 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=639950127106 2026-03-08T23:15:27.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 639950127106' 2026-03-08T23:15:27.388 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 639950127106 2026-03-08T23:15:27.389 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:27.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 639950127106 2026-03-08T23:15:27.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:15:28.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:15:28.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:28.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 639950127106 -lt 639950127106 2026-03-08T23:15:28.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:28.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-618475290628 2026-03-08T23:15:28.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut 
-d - -f 1 2026-03-08T23:15:28.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:15:28.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-618475290628 2026-03-08T23:15:28.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:28.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=618475290628 2026-03-08T23:15:28.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 618475290628' 2026-03-08T23:15:28.737 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 618475290628 2026-03-08T23:15:28.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:15:28.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 618475290629 -lt 618475290628 2026-03-08T23:15:28.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:15:28.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:28.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:29.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
wait_for_clean: test 6 == 0 2026-03-08T23:15:29.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:15:29.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:15:29.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:15:29.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:15:29.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:15:29.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:15:29.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:15:29.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:15:29.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:15:29.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:29.251 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1118: TEST_corrupt_scrub_replicated: echo -n bad-val 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1119: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ16 set-attr snapset td/osd-scrub-repair/bad-val 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ16 set-attr 
snapset td/osd-scrub-repair/bad-val 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:15:29.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:15:29.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:15:29.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:15:29.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:15:29.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:15:29.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:15:29.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:15:29.543 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ16 set-attr snapset td/osd-scrub-repair/bad-val 2026-03-08T23:15:29.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:15:29.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:15:29.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:15:29.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:15:29.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:15:29.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ16 set-attr snapset td/osd-scrub-repair/bad-val 2026-03-08T23:15:30.210 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:30259878:::ROBJ15:head#, (61) No data available 2026-03-08T23:15:30.210 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (61) No data available 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:15:30.746 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:15:30.746 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:15:30.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:15:30.747 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:15:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:15:30.747 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:15:30.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:15:30.748 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:15:30.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:15:30.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:15:30.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:15:30.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:15:30.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:15:30.754 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:15:30.764 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:30.766+0000 7f4bc221c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:30.764 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:30.766+0000 7f4bc221c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:30.765 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:30.766+0000 7f4bc221c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:15:30.931 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:31.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:31.719 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:31.722+0000 7f4bc221c8c0 -1 Falling back to public interface 2026-03-08T23:15:32.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:32.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:32.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:15:32.109 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:15:32.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:32.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:32.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:32.715 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:32.718+0000 7f4bc221c8c0 -1 osd.1 150 log_to_monitors true 2026-03-08T23:15:33.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:33.282 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:33.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:15:33.282 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:15:33.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:33.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:33.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:34.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:34.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:34.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:15:34.478 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:15:34.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:34.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 154 up_thru 154 down_at 151 last_clean_interval [144,150) [v2:127.0.0.1:6810/3382760002,v1:127.0.0.1:6811/3382760002] 
[v2:127.0.0.1:6812/3382760002,v1:127.0.0.1:6813/3382760002] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:15:34.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:15:34.681 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:15:34.681 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:15:34.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 
2026-03-08T23:15:34.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:15:34.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:15:34.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:15:34.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:15:34.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:15:34.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:15:34.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:15:34.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:15:34.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:15:34.936 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:15:34.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:15:34.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:15:34.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:15:35.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=639950127108 2026-03-08T23:15:35.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 639950127108 2026-03-08T23:15:35.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-639950127108' 2026-03-08T23:15:35.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:15:35.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:15:35.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=661424963586 2026-03-08T23:15:35.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 661424963586 2026-03-08T23:15:35.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-639950127108 1-661424963586' 2026-03-08T23:15:35.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:35.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-639950127108 2026-03-08T23:15:35.112 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:35.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:15:35.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-639950127108 2026-03-08T23:15:35.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=639950127108 2026-03-08T23:15:35.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 639950127108' 2026-03-08T23:15:35.115 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 639950127108 2026-03-08T23:15:35.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:35.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 639950127107 -lt 639950127108 2026-03-08T23:15:35.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:15:36.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:15:36.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:15:36.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 639950127107 -lt 639950127108 2026-03-08T23:15:36.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:15:37.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:15:37.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:37.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 639950127109 -lt 639950127108 2026-03-08T23:15:37.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:37.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-661424963586 2026-03-08T23:15:37.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:37.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:15:37.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-661424963586 2026-03-08T23:15:37.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:37.649 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=661424963586 2026-03-08T23:15:37.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 661424963586' 2026-03-08T23:15:37.649 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 661424963586 2026-03-08T23:15:37.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:15:37.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 661424963586 -lt 661424963586 2026-03-08T23:15:37.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:15:37.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:37.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:38.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:15:38.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:15:38.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:15:38.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 
2026-03-08T23:15:38.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:15:38.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:15:38.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:15:38.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:15:38.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:15:38.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:15:38.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:38.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:15:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:15:38.428 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:15:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:15:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ17 2026-03-08T23:15:38.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 17 % 2 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1124: TEST_corrupt_scrub_replicated: local payload=ROBJ17 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1125: TEST_corrupt_scrub_replicated: echo ROBJ17 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1126: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ17 set-bytes td/osd-scrub-repair/new.ROBJ17 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:15:38.429 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ17 set-bytes td/osd-scrub-repair/new.ROBJ17 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:15:38.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:15:38.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:15:38.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:15:38.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:15:38.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:15:38.430 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:15:38.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:15:38.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:15:38.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:15:38.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ17 set-bytes td/osd-scrub-repair/new.ROBJ17 2026-03-08T23:15:38.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:15:38.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:15:38.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:15:38.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:15:38.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:15:38.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path 
td/osd-scrub-repair/0 ROBJ17 set-bytes td/osd-scrub-repair/new.ROBJ17
2026-03-08T23:15:39.618 INFO:tasks.workunit.client.0.vm03.stderr:Error decoding attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (22) Invalid argument
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:15:40.155 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:15:40.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:15:40.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:15:40.157 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T23:15:40.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:15:40.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:15:40.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:15:40.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:15:40.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:15:40.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:15:40.176 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:40.178+0000 7fe5a37058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:40.176 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:40.178+0000 7fe5a37058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:40.178 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:40.178+0000 7fe5a37058c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:40.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:15:40.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:41.399 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:41.402+0000 7fe5a37058c0 -1 Falling back to public interface
2026-03-08T23:15:41.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:41.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:41.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:15:41.545 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:15:41.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:41.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:15:41.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:42.512 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:42.514+0000 7fe5a37058c0 -1 osd.0 155 log_to_monitors true
2026-03-08T23:15:42.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:42.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:42.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:15:42.742 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:15:42.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:42.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:15:43.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:44.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:44.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:44.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:15:44.089 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:15:44.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:44.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:15:44.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:44.823 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:44.826+0000 7fe59a6b5640 -1 osd.0 155 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:15:45.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:45.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:45.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T23:15:45.255 INFO:tasks.workunit.client.0.vm03.stderr:4
2026-03-08T23:15:45.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:45.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:15:45.422 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 159 up_thru 159 down_at 156 last_clean_interval [149,155) [v2:127.0.0.1:6802/1643983150,v1:127.0.0.1:6803/1643983150] [v2:127.0.0.1:6804/1643983150,v1:127.0.0.1:6805/1643983150] exists,up b8b43426-8172-41a4-ad76-a875625b04e6
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:15:45.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:15:45.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:15:45.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:15:45.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:15:45.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:15:45.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:15:45.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:15:45.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:15:45.649 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:15:45.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:15:45.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:15:45.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:15:45.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=682899800066
2026-03-08T23:15:45.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 682899800066
2026-03-08T23:15:45.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-682899800066'
2026-03-08T23:15:45.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:15:45.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:15:45.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=661424963589
2026-03-08T23:15:45.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 661424963589
2026-03-08T23:15:45.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-682899800066 1-661424963589'
2026-03-08T23:15:45.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:15:45.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-682899800066
2026-03-08T23:15:45.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:15:45.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:15:45.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-682899800066
2026-03-08T23:15:45.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:15:45.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=682899800066
2026-03-08T23:15:45.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 682899800066'
2026-03-08T23:15:45.863 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 682899800066
2026-03-08T23:15:45.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:15:46.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 682899800066
2026-03-08T23:15:46.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:15:47.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:15:47.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:15:47.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 682899800066 -lt 682899800066
2026-03-08T23:15:47.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:15:47.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-661424963589
2026-03-08T23:15:47.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:15:47.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:15:47.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-661424963589
2026-03-08T23:15:47.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:15:47.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=661424963589
2026-03-08T23:15:47.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 661424963589'
2026-03-08T23:15:47.190 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 661424963589
2026-03-08T23:15:47.190 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:15:47.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 661424963589 -lt 661424963589
2026-03-08T23:15:47.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:15:47.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:15:47.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:15:47.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:15:47.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:15:47.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:15:47.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:15:47.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:15:47.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:15:47.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:15:47.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:15:47.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:15:47.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:15:47.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:15:47.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1127: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ17 set-bytes td/osd-scrub-repair/new.ROBJ17
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ17 set-bytes td/osd-scrub-repair/new.ROBJ17
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:15:47.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:15:48.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:15:48.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ17 set-bytes td/osd-scrub-repair/new.ROBJ17
2026-03-08T23:15:48.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:15:48.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:15:48.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:15:48.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:15:48.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:15:48.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ17 set-bytes td/osd-scrub-repair/new.ROBJ17
2026-03-08T23:15:48.983 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:30259878:::ROBJ15:head#, (61) No data available
2026-03-08T23:15:48.983 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (61) No data available
2026-03-08T23:15:49.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:15:49.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+='
--osd-max-object-name-len=460' 2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:15:49.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:15:49.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:15:49.520 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:15:49.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
'--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:15:49.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:15:49.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:15:49.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:15:49.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:15:49.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:15:49.537 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:49.538+0000 7f3baacfc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:49.537 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:49.538+0000 7f3baacfc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:15:49.539 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:49.542+0000 7f3baacfc8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:49.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:49.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:50.247 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:50.250+0000 7f3baacfc8c0 -1 Falling back to public interface 2026-03-08T23:15:50.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:15:50.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:50.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:15:50.881 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:15:50.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:50.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:51.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:51.460 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:51.462+0000 7f3baacfc8c0 -1 osd.1 161 log_to_monitors true 2026-03-08T23:15:52.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:52.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:52.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:15:52.056 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:15:52.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:52.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:52.223 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:15:53.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:15:53.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:15:53.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:15:53.225 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:15:53.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:15:53.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 165 up_thru 165 down_at 162 last_clean_interval [154,161) [v2:127.0.0.1:6810/2047642912,v1:127.0.0.1:6811/2047642912] [v2:127.0.0.1:6812/2047642912,v1:127.0.0.1:6813/2047642912] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:15:53.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:15:53.387 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:15:53.387 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:15:53.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:15:53.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:15:53.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:15:53.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:15:53.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:15:53.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:15:53.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:15:53.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:15:53.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:15:53.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:15:53.614 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:15:53.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:15:53.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:15:53.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:15:53.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=682899800069 2026-03-08T23:15:53.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 682899800069 2026-03-08T23:15:53.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-682899800069' 2026-03-08T23:15:53.696 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:15:53.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:15:53.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=708669603842 2026-03-08T23:15:53.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 708669603842 2026-03-08T23:15:53.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-682899800069 1-708669603842' 2026-03-08T23:15:53.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:53.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-682899800069 2026-03-08T23:15:53.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:53.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:15:53.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-682899800069 2026-03-08T23:15:53.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:53.778 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=682899800069 2026-03-08T23:15:53.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 682899800069' 2026-03-08T23:15:53.779 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 682899800069 2026-03-08T23:15:53.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:53.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 682899800067 -lt 682899800069 2026-03-08T23:15:53.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:15:54.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:15:54.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:15:55.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 682899800069 -lt 682899800069 2026-03-08T23:15:55.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:15:55.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-708669603842 2026-03-08T23:15:55.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:15:55.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:15:55.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-708669603842 2026-03-08T23:15:55.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:15:55.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=708669603842 2026-03-08T23:15:55.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 708669603842' 2026-03-08T23:15:55.117 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 708669603842 2026-03-08T23:15:55.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:15:55.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 708669603842 -lt 708669603842 2026-03-08T23:15:55.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:15:55.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:55.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:55.485 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:15:55.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:15:55.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:15:55.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:15:55.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:15:55.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:15:55.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:15:55.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:15:55.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:15:55.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:15:55.649 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:15:55.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:15:55.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:15:55.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:15:55.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:15:55.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs) 2026-03-08T23:15:55.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ18 2026-03-08T23:15:55.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 18 % 2 2026-03-08T23:15:55.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=0 2026-03-08T23:15:55.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1132: TEST_corrupt_scrub_replicated: local payload=ROBJ18 
2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1133: TEST_corrupt_scrub_replicated: echo ROBJ18 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1134: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ18 set-bytes td/osd-scrub-repair/new.ROBJ18 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ18 set-bytes td/osd-scrub-repair/new.ROBJ18 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:15:55.851 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:15:55.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:15:55.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:15:55.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ18 set-bytes td/osd-scrub-repair/new.ROBJ18 2026-03-08T23:15:55.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:15:55.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:15:55.957 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:15:55.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:15:55.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:15:55.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ18 set-bytes td/osd-scrub-repair/new.ROBJ18 2026-03-08T23:15:56.627 INFO:tasks.workunit.client.0.vm03.stderr:Error decoding attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (22) Invalid argument 2026-03-08T23:15:57.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:15:57.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:15:57.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:15:57.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:15:57.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:15:57.168 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T23:15:57.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:15:57.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:15:57.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:15:57.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:15:57.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:15:57.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:15:57.186 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:57.186+0000 7f2d8b9968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:57.187 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:57.190+0000 7f2d8b9968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:57.188 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:57.190+0000 7f2d8b9968c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:57.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:15:57.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:58.398 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:58.402+0000 7f2d8b9968c0 -1 Falling back to public interface
2026-03-08T23:15:58.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:58.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:58.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:15:58.520 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:15:58.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:58.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:15:58.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:15:59.377 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:15:59.378+0000 7f2d8b9968c0 -1 osd.0 167 log_to_monitors true
2026-03-08T23:15:59.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:15:59.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:15:59.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:15:59.681 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:15:59.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:15:59.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:15:59.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:16:00.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:16:00.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:16:00.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:16:00.851 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:16:00.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:16:00.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:16:01.014 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 171 up_thru 171 down_at 168 last_clean_interval [159,167) [v2:127.0.0.1:6802/1237552737,v1:127.0.0.1:6803/1237552737] [v2:127.0.0.1:6804/1237552737,v1:127.0.0.1:6805/1237552737] exists,up b8b43426-8172-41a4-ad76-a875625b04e6
2026-03-08T23:16:01.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:16:01.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:16:01.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:16:01.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:16:01.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:16:01.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:16:01.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:16:01.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:16:01.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:16:01.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:16:01.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:16:01.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:16:01.251 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:16:01.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:16:01.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:16:01.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:16:01.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=734439407618
2026-03-08T23:16:01.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 734439407618
2026-03-08T23:16:01.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-734439407618'
2026-03-08T23:16:01.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:16:01.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:16:01.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=708669603844
2026-03-08T23:16:01.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 708669603844
2026-03-08T23:16:01.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-734439407618 1-708669603844'
2026-03-08T23:16:01.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:16:01.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-734439407618
2026-03-08T23:16:01.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:16:01.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:16:01.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-734439407618
2026-03-08T23:16:01.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:16:01.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=734439407618
2026-03-08T23:16:01.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 734439407618'
2026-03-08T23:16:01.421 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 734439407618
2026-03-08T23:16:01.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:16:01.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 734439407618
2026-03-08T23:16:01.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:16:02.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:16:02.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:16:02.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407618 -lt 734439407618
2026-03-08T23:16:02.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:16:02.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-708669603844
2026-03-08T23:16:02.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:16:02.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:16:02.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-708669603844
2026-03-08T23:16:02.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:16:02.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=708669603844
2026-03-08T23:16:02.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 708669603844'
2026-03-08T23:16:02.768 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 708669603844
2026-03-08T23:16:02.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:16:02.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 708669603845 -lt 708669603844
2026-03-08T23:16:02.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:16:02.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:16:02.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:16:03.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:16:03.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:16:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:16:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:16:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:16:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:16:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:16:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:16:03.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:16:03.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:16:03.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:16:03.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1135: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ18 set-bytes td/osd-scrub-repair/new.ROBJ18
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ18 set-bytes td/osd-scrub-repair/new.ROBJ18
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:16:03.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:16:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:16:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ18 set-bytes td/osd-scrub-repair/new.ROBJ18
2026-03-08T23:16:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:16:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:16:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:16:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:16:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:16:03.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ18 set-bytes td/osd-scrub-repair/new.ROBJ18
2026-03-08T23:16:04.283 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:30259878:::ROBJ15:head#, (61) No data available
2026-03-08T23:16:04.284 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (61) No data available
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:16:04.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:16:04.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:16:04.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:16:04.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:16:04.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:16:04.818 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:16:04.818 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:16:04.818 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:16:04.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:16:04.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:16:04.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T23:16:04.820 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T23:16:04.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:16:04.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T23:16:04.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T23:16:04.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:16:04.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:16:04.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:16:04.836 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:04.838+0000 7f43e5bb38c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:16:04.837 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:04.838+0000 7f43e5bb38c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:16:04.838 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:04.838+0000 7f43e5bb38c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:04.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:05.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:06.038 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:06.042+0000 7f43e5bb38c0 -1 Falling back to 
public interface 2026-03-08T23:16:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:16:06.157 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:16:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:06.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:07.045 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:07.046+0000 7f43e5bb38c0 -1 osd.1 172 log_to_monitors true 2026-03-08T23:16:07.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:07.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:07.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:16:07.324 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:16:07.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:07.324 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:07.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:08.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:08.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:08.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:16:08.494 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:16:08.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:08.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 176 up_thru 176 down_at 173 last_clean_interval [165,172) [v2:127.0.0.1:6810/1545126076,v1:127.0.0.1:6811/1545126076] [v2:127.0.0.1:6812/1545126076,v1:127.0.0.1:6813/1545126076] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 
2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:16:08.664 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:16:08.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:16:08.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:16:08.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:16:08.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 
2026-03-08T23:16:08.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:16:08.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:16:08.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:16:08.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:16:08.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:16:08.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:16:08.908 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:16:08.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:16:08.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:08.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:16:08.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=734439407620 2026-03-08T23:16:08.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 734439407620 2026-03-08T23:16:08.986 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-734439407620' 2026-03-08T23:16:08.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:08.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:16:09.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=755914244098 2026-03-08T23:16:09.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 755914244098 2026-03-08T23:16:09.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-734439407620 1-755914244098' 2026-03-08T23:16:09.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:09.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-734439407620 2026-03-08T23:16:09.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:09.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:16:09.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-734439407620 2026-03-08T23:16:09.066 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:09.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=734439407620 2026-03-08T23:16:09.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 734439407620' 2026-03-08T23:16:09.067 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 734439407620 2026-03-08T23:16:09.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:09.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407619 -lt 734439407620 2026-03-08T23:16:09.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:16:10.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:16:10.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:10.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407619 -lt 734439407620 2026-03-08T23:16:10.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:16:11.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 
-eq 0 ']' 2026-03-08T23:16:11.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:11.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407621 -lt 734439407620 2026-03-08T23:16:11.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:11.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-755914244098 2026-03-08T23:16:11.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:11.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:16:11.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-755914244098 2026-03-08T23:16:11.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:11.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=755914244098 2026-03-08T23:16:11.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 755914244098' 2026-03-08T23:16:11.587 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 755914244098 2026-03-08T23:16:11.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:16:11.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 755914244098 -lt 755914244098 2026-03-08T23:16:11.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:16:11.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:11.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:11.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:16:11.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:16:11.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:16:11.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:16:11.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:16:11.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:16:11.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: 
get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:16:11.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:16:12.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:16:12.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:16:12.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:12.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1137: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ18 corrupt-info 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:16:12.341 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ18 corrupt-info 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:16:12.341 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:16:12.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:16:12.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:16:12.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ18 corrupt-info 2026-03-08T23:16:12.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:16:12.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:16:12.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:16:12.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:16:12.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:16:12.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ18 
corrupt-info 2026-03-08T23:16:13.122 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:30259878:::ROBJ15:head#, (61) No data available 2026-03-08T23:16:13.122 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (61) No data available 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: 
ceph_args+=' --osd-journal-size=100'
2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:16:13.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:16:13.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:16:13.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T23:16:13.655 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T23:16:13.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:16:13.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T23:16:13.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T23:16:13.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:16:13.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:16:13.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:16:13.673 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:13.674+0000 7fba31e0f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:16:13.673 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:13.674+0000 7fba31e0f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:16:13.675 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:13.678+0000 7fba31e0f8c0 -1 WARNING: all dangerous and experimental features are enabled.
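The trace above shows `activate_osd` accumulating all daemon flags into a single `ceph_args` string before launching `ceph-osd`. A simplified, standalone sketch of that pattern follows; the variable names mirror the trace, but this is an illustration, not the real `ceph-helpers.sh` implementation. Note how metavariables such as `$name` are appended inside single quotes so the shell passes them through literally and the daemon expands them itself (visible in the quoted `--log-file=td/osd-scrub-repair/$name.log` argument in the final `ceph-osd` command line).

```shell
# Simplified sketch of the argument-accumulation pattern seen in
# activate_osd above (names mirror the trace; not the real helper).
dir=td/osd-scrub-repair
id=1
ceph_args=''
ceph_args+=' --osd-journal-size=100'
ceph_args+=" --osd-data=$dir/$id"          # expanded now, by the shell
ceph_args+=" --run-dir=$dir"
# Metavariables stay single-quoted so the daemon, not the shell, expands them:
ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
echo "$ceph_args"
```

The string is later word-split when passed unquoted to `ceph-osd`, which is why the helper avoids flag values containing spaces.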
2026-03-08T23:16:13.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T23:16:13.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:16:13.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:16:13.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:16:13.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:16:13.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:16:13.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:16:13.847 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:16:13.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:16:13.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:16:14.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:16:14.623 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:14.626+0000 7fba31e0f8c0 -1 Falling back to public interface
2026-03-08T23:16:15.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:16:15.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:16:15.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:16:15.031 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:16:15.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:16:15.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:16:15.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:16:15.615 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:15.618+0000 7fba31e0f8c0 -1 osd.1 177 log_to_monitors true
2026-03-08T23:16:16.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:16:16.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:16:16.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:16:16.212 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:16:16.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:16:16.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:16:16.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:16:16.637 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:16.638+0000 7fba28dbf640 -1 osd.1 177 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:16:17.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:16:17.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:16:17.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:16:17.391 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:16:17.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:16:17.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:16:17.561 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 181 up_thru 181 down_at 178 last_clean_interval [176,177) [v2:127.0.0.1:6810/487639942,v1:127.0.0.1:6811/487639942] [v2:127.0.0.1:6812/487639942,v1:127.0.0.1:6813/487639942] exists,up f603f62e-24cf-4fca-903d-50e492fba08d
2026-03-08T23:16:17.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:16:17.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:16:17.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:16:17.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:16:17.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:16:17.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:16:17.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:16:17.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:16:17.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:16:17.806 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:16:17.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:16:17.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:16:17.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:16:17.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=734439407623
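In the `wait_for_clean` trace above, `get_timeout_delays 90 .1` produced the delays array `('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')`. The rule this sketch implements is inferred from that logged output, not from the helper's actual code: each delay doubles, individual delays are capped at 15 s, and the final entry is clipped so the schedule sums to the 90 s budget.

```shell
# Reproduce the backoff schedule seen in the wait_for_clean trace:
# 90 s budget, 0.1 s base, doubling, 15 s cap, last entry clipped.
# (Rule inferred from the logged delays array, not from ceph-helpers.sh.)
delays=$(awk -v timeout=90 -v d=0.1 -v cap=15 'BEGIN {
    total = 0; sep = ""
    while (timeout - total > 1e-9) {
        if (d > cap) d = cap                          # cap each delay
        if (d > timeout - total) d = timeout - total  # clip final entry
        printf "%s%g", sep, d; sep = " "
        total += d
        d *= 2
    }
}')
echo "$delays"
```

Capping plus a clipped tail gives roughly `log2(cap/base)` fast probes up front while still honoring the overall timeout exactly, which is why the loop in `wait_for_clean` can walk the array instead of recomputing sleeps.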
2026-03-08T23:16:17.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 734439407623
2026-03-08T23:16:17.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-734439407623'
2026-03-08T23:16:17.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:16:17.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:16:17.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=777389080578
2026-03-08T23:16:17.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 777389080578
2026-03-08T23:16:17.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-734439407623 1-777389080578'
2026-03-08T23:16:17.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:16:17.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-734439407623
2026-03-08T23:16:17.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:16:17.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:16:17.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-734439407623
2026-03-08T23:16:17.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:16:17.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=734439407623
2026-03-08T23:16:17.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 734439407623'
2026-03-08T23:16:17.972 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 734439407623
2026-03-08T23:16:17.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:16:18.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407622 -lt 734439407623
2026-03-08T23:16:18.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:16:19.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:16:19.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:16:19.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407623 -lt 734439407623
2026-03-08T23:16:19.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:16:19.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-777389080578
2026-03-08T23:16:19.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:16:19.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:16:19.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-777389080578
2026-03-08T23:16:19.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:16:19.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=777389080578
2026-03-08T23:16:19.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 777389080578'
2026-03-08T23:16:19.311 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 777389080578
2026-03-08T23:16:19.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:16:19.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 777389080578 -lt 777389080578
2026-03-08T23:16:19.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:16:19.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:16:19.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:16:19.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:16:19.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:16:19.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:16:19.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:16:19.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:16:19.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:16:19.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:16:19.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:16:19.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:16:19.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:16:19.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:16:19.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:16:20.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:16:20.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:16:20.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:16:20.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1029: TEST_corrupt_scrub_replicated: for i in $(seq 1 $total_objs)
2026-03-08T23:16:20.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1030: TEST_corrupt_scrub_replicated: objname=ROBJ19
2026-03-08T23:16:20.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: expr 19 % 2
2026-03-08T23:16:20.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1033: TEST_corrupt_scrub_replicated: local osd=1
2026-03-08T23:16:20.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1035: TEST_corrupt_scrub_replicated: case $i in
2026-03-08T23:16:20.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1146: TEST_corrupt_scrub_replicated: get_pg csr_pool ROBJ0
2026-03-08T23:16:20.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=csr_pool
2026-03-08T23:16:20.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=ROBJ0
2026-03-08T23:16:20.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map csr_pool ROBJ0
2026-03-08T23:16:20.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T23:16:20.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1146: TEST_corrupt_scrub_replicated: local pg=3.0
2026-03-08T23:16:20.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1148: TEST_corrupt_scrub_replicated: ceph tell 'osd.*' injectargs -- --osd-max-object-size=1048576
2026-03-08T23:16:20.269 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: {}
2026-03-08T23:16:20.269 INFO:tasks.workunit.client.0.vm03.stderr:osd.0: osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_max_object_size = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = ''
2026-03-08T23:16:20.278 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: {}
2026-03-08T23:16:20.278 INFO:tasks.workunit.client.0.vm03.stderr:osd.1: osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_max_object_size = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = ''
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1150: TEST_corrupt_scrub_replicated: inject_eio rep data csr_pool ROBJ11 td/osd-scrub-repair 0
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=rep
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=csr_pool
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=ROBJ11
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/osd-scrub-repair
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds csr_pool ROBJ11
2026-03-08T23:16:20.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=csr_pool
2026-03-08T23:16:20.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=ROBJ11
2026-03-08T23:16:20.289 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map csr_pool ROBJ11
2026-03-08T23:16:20.289 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:16:20.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1
2026-03-08T23:16:20.457 INFO:tasks.workunit.client.0.vm03.stderr:0'
2026-03-08T23:16:20.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0
2026-03-08T23:16:20.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('1' '0')
2026-03-08T23:16:20.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds
2026-03-08T23:16:20.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1
2026-03-08T23:16:20.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' rep '!=' ec ']'
2026-03-08T23:16:20.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2473: inject_eio: shard_id=
2026-03-08T23:16:20.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/osd-scrub-repair/1/type
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")'
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T23:16:20.459 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T23:16:20.460 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:16:20.460 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:16:20.460 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:16:20.460 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok
2026-03-08T23:16:20.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.1.asok config set bluestore_debug_inject_read_err true
2026-03-08T23:16:20.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true
2026-03-08T23:16:20.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0
2026-03-08T23:16:20.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']'
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS=
2026-03-08T23:16:20.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.43024/ceph-osd.1.asok injectdataerr csr_pool ROBJ11
2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1151: TEST_corrupt_scrub_replicated: inject_eio rep mdata csr_pool ROBJ12 td/osd-scrub-repair 1
2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=rep
2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift
2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=mdata
2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift
2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=csr_pool
2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift
2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=ROBJ12
2026-03-08T23:16:20.600
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/osd-scrub-repair 2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1 2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds csr_pool ROBJ12 2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=csr_pool 2026-03-08T23:16:20.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=ROBJ12 2026-03-08T23:16:20.601 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map csr_pool ROBJ12 2026-03-08T23:16:20.601 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:16:20.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1 2026-03-08T23:16:20.771 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:16:20.771 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2026-03-08T23:16:20.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('1' '0') 2026-03-08T23:16:20.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T23:16:20.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=0 2026-03-08T23:16:20.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' rep '!=' ec ']' 2026-03-08T23:16:20.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2473: inject_eio: shard_id= 2026-03-08T23:16:20.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/osd-scrub-repair/0/type 2026-03-08T23:16:20.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T23:16:20.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 0 bluestore_debug_inject_read_err true 2026-03-08T23:16:20.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T23:16:20.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=0 2026-03-08T23:16:20.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: 
set_config: local config=bluestore_debug_inject_read_err 2026-03-08T23:16:20.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T23:16:20.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T23:16:20.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.0 2026-03-08T23:16:20.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:16:20.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:16:20.773 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:20.773 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:20.773 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:20.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:16:20.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.0.asok config set bluestore_debug_inject_read_err true 2026-03-08T23:16:20.841 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T23:16:20.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T23:16:20.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.0 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T23:16:20.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: 
inject_eio: ceph --admin-daemon /tmp/ceph-asok.43024/ceph-osd.0.asok injectmdataerr csr_pool ROBJ12 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1152: TEST_corrupt_scrub_replicated: inject_eio rep mdata csr_pool ROBJ13 td/osd-scrub-repair 1 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=rep 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=mdata 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=csr_pool 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=ROBJ13 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/osd-scrub-repair 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 
2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds csr_pool ROBJ13 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=csr_pool 2026-03-08T23:16:20.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=ROBJ13 2026-03-08T23:16:20.911 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map csr_pool ROBJ13 2026-03-08T23:16:20.911 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:16:21.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1 2026-03-08T23:16:21.083 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:16:21.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2026-03-08T23:16:21.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('1' '0') 2026-03-08T23:16:21.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T23:16:21.083 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=0 2026-03-08T23:16:21.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' rep '!=' ec ']' 2026-03-08T23:16:21.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2473: inject_eio: shard_id= 2026-03-08T23:16:21.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/osd-scrub-repair/0/type 2026-03-08T23:16:21.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T23:16:21.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 0 bluestore_debug_inject_read_err true 2026-03-08T23:16:21.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T23:16:21.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=0 2026-03-08T23:16:21.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T23:16:21.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T23:16:21.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T23:16:21.085 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.0 2026-03-08T23:16:21.085 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:16:21.085 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:16:21.085 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:21.085 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:21.085 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:21.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:16:21.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.0.asok config set bluestore_debug_inject_read_err true 2026-03-08T23:16:21.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T23:16:21.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T23:16:21.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T23:16:21.157 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.0 2026-03-08T23:16:21.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:16:21.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:16:21.158 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:21.158 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:21.158 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:21.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:16:21.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T23:16:21.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.43024/ceph-osd.0.asok injectmdataerr csr_pool ROBJ13 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1153: TEST_corrupt_scrub_replicated: inject_eio rep data csr_pool ROBJ13 td/osd-scrub-repair 0 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=rep 
2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=csr_pool 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=ROBJ13 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/osd-scrub-repair 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds csr_pool ROBJ13 2026-03-08T23:16:21.224 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=csr_pool 2026-03-08T23:16:21.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=ROBJ13 2026-03-08T23:16:21.225 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map csr_pool ROBJ13 2026-03-08T23:16:21.225 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:16:21.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1 2026-03-08T23:16:21.393 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:16:21.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2026-03-08T23:16:21.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('1' '0') 2026-03-08T23:16:21.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T23:16:21.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1 2026-03-08T23:16:21.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' rep '!=' ec ']' 2026-03-08T23:16:21.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2473: inject_eio: shard_id= 2026-03-08T23:16:21.394 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/osd-scrub-repair/1/type 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:16:21.395 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:16:21.396 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:21.396 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:21.396 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:21.396 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T23:16:21.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.1.asok config set bluestore_debug_inject_read_err true 2026-03-08T23:16:21.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T23:16:21.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T23:16:21.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T23:16:21.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1 2026-03-08T23:16:21.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:16:21.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:16:21.463 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:21.463 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:21.463 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:21.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T23:16:21.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T23:16:21.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.43024/ceph-osd.1.asok injectdataerr csr_pool ROBJ13 2026-03-08T23:16:21.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1155: TEST_corrupt_scrub_replicated: pg_scrub 3.0 2026-03-08T23:16:21.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=3.0 2026-03-08T23:16:21.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 3.0 2026-03-08T23:16:21.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=3.0 2026-03-08T23:16:21.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:16:21.527 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:16:21.527 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:16:21.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:16:21.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:16:21.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:16:21.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:16:21.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:16:21.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:16:21.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:16:21.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:16:21.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 
2026-03-08T23:16:21.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:16:21.843 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:16:21.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:16:21.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:21.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:16:21.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=734439407625 2026-03-08T23:16:21.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 734439407625 2026-03-08T23:16:21.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-734439407625' 2026-03-08T23:16:21.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:21.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:16:21.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=777389080580 2026-03-08T23:16:21.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 777389080580 2026-03-08T23:16:21.997 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-734439407625 1-777389080580' 2026-03-08T23:16:21.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:21.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-734439407625 2026-03-08T23:16:21.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:21.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:16:21.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-734439407625 2026-03-08T23:16:21.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:22.000 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 734439407625 2026-03-08T23:16:22.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=734439407625 2026-03-08T23:16:22.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 734439407625' 2026-03-08T23:16:22.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:22.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407624 -lt 
734439407625 2026-03-08T23:16:22.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:16:23.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:16:23.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:23.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407625 -lt 734439407625 2026-03-08T23:16:23.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:23.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-777389080580 2026-03-08T23:16:23.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:23.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:16:23.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-777389080580 2026-03-08T23:16:23.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:23.326 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 777389080580 2026-03-08T23:16:23.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=777389080580 
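The `flush_pg_stats` helper being traced here tells each OSD to flush its PG stats, records the sequence number each flush returns, then polls `ceph osd last-stat-seq` until the monitor has absorbed every sequence (the `734439407624 -lt 734439407625` comparison above is one such poll coming up one short and sleeping). A sketch of that pattern, assuming a working `ceph` CLI on PATH (`flush_pg_stats_sketch` is an illustrative name):

```shell
# Hedged sketch of the flush_pg_stats pattern from the trace: ask every
# OSD to flush its PG stats, remember the sequence number each flush
# returns, then wait until the mon's last-stat-seq catches up per OSD.
flush_pg_stats_sketch() {
    local timeout=${1:-300} osd seq s seqs=""
    for osd in $(ceph osd ls); do
        seq=$(ceph tell "osd.$osd" flush_pg_stats)
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=${s%-*}
        seq=${s#*-}
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
            sleep 1
            timeout=$(( timeout - 1 ))
            [ "$timeout" -eq 0 ] && return 1
        done
    done
}
```

This barrier matters because the subsequent `is_pg_clean` / `get_last_scrub_stamp` checks read mon-side PG stats, which would otherwise be stale.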
2026-03-08T23:16:23.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 777389080580' 2026-03-08T23:16:23.327 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 3.0 loop 0 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 777389080580 -lt 777389080580 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 3.0 loop 0' 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 3.0 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=3.0 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 3.0 query 2026-03-08T23:16:23.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: 
pg_state=active+clean 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 3.0 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:16:23.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:16:23.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local last_scrub=2026-03-08T23:13:16.031237+0000 2026-03-08T23:16:23.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 3.0 2026-03-08T23:16:23.862 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0 on osd.1 to scrub 2026-03-08T23:16:23.873 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 3.0 2026-03-08T23:13:16.031237+0000 2026-03-08T23:16:23.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0 2026-03-08T23:16:23.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:13:16.031237+0000 2026-03-08T23:16:23.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:16:23.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:16:23.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:16:23.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:16:23.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:16:23.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:16:23.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:16:23.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | 
.[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:16:24.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:13:16.031237+0000 '>' 2026-03-08T23:13:16.031237+0000 2026-03-08T23:16:24.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:16:25.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:16:25.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:16:25.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:16:25.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:16:25.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:16:25.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:16:25.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:16:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:13:16.031237+0000 '>' 2026-03-08T23:13:16.031237+0000 
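The `pg_scrub` sequence above records the PG's `last_scrub_stamp`, issues `ceph pg scrub`, then polls once a second until the stamp advances past the recorded value. A sketch of that polling loop, with the stamp lookup factored into its own function (both `*_sketch` names are illustrative; assumes `ceph` and `jq` on PATH):

```shell
# Hedged sketch of the wait_for_scrub loop in the trace: the scrub is
# considered complete once the PG's last_scrub_stamp moves past the
# stamp taken before `ceph pg scrub` was issued.
get_last_scrub_stamp_sketch() {
    ceph --format json pg dump pgs |
        jq -r ".pg_stats | .[] | select(.pgid==\"$1\") | .last_scrub_stamp"
}

wait_for_scrub_sketch() {
    local pgid=$1 last_scrub=$2 stamp i
    for ((i = 0; i < 300; i++)); do
        stamp=$(get_last_scrub_stamp_sketch "$pgid")
        # ISO-8601 stamps sort lexicographically, so bash's string
        # comparison operator '>' is enough here (as in the trace).
        if test "$stamp" '>' "$last_scrub"; then
            return 0
        fi
        sleep 1
    done
    return 1
}
```

In the trace this loop spins three times on an unchanged stamp before `2026-03-08T23:16:24.561153+0000 > 2026-03-08T23:13:16.031237+0000` finally holds and `wait_for_scrub` returns 0.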
2026-03-08T23:16:25.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:16:26.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:16:26.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:16:26.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:16:26.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:16:26.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:16:26.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:16:26.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:16:26.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:13:16.031237+0000 '>' 2026-03-08T23:13:16.031237+0000 2026-03-08T23:16:26.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:16:27.352 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:16:27.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:16:27.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:16:27.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:16:27.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:16:27.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:16:27.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:16:24.561153+0000 '>' 2026-03-08T23:13:16.031237+0000 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1157: TEST_corrupt_scrub_replicated: ERRORS=0 2026-03-08T23:16:27.508 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1158: TEST_corrupt_scrub_replicated: declare -a s_err_strings 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1159: TEST_corrupt_scrub_replicated: err_strings[0]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:30259878:::ROBJ15:head : candidate had a missing info key' 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1160: TEST_corrupt_scrub_replicated: err_strings[1]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:33aca486:::ROBJ18:head : object info inconsistent ' 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1161: TEST_corrupt_scrub_replicated: err_strings[2]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:5c7b2c47:::ROBJ16:head : candidate had a corrupt snapset' 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1162: TEST_corrupt_scrub_replicated: err_strings[3]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:5c7b2c47:::ROBJ16:head : candidate had a missing snapset key' 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1163: TEST_corrupt_scrub_replicated: err_strings[4]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:5c7b2c47:::ROBJ16:head : failed to pick suitable object info' 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1164: TEST_corrupt_scrub_replicated: err_strings[5]='log_channel[(]cluster[)] log 
[[]ERR[]] : [0-9]*[.]0 soid 3:86586531:::ROBJ8:head : attr value mismatch '\''_key1-ROBJ8'\'', attr name mismatch '\''_key3-ROBJ8'\'', attr name mismatch '\''_key2-ROBJ8'\''' 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1165: TEST_corrupt_scrub_replicated: err_strings[6]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:bc819597:::ROBJ12:head : candidate had a stat error' 2026-03-08T23:16:27.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1166: TEST_corrupt_scrub_replicated: err_strings[7]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:c0c86b1d:::ROBJ14:head : candidate had a missing info key' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1167: TEST_corrupt_scrub_replicated: err_strings[8]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:c0c86b1d:::ROBJ14:head : candidate had a corrupt info' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1168: TEST_corrupt_scrub_replicated: err_strings[9]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:c0c86b1d:::ROBJ14:head : failed to pick suitable object info' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1169: TEST_corrupt_scrub_replicated: err_strings[10]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ce3f1d6a:::ROBJ1:head : candidate size 9 info size 7 mismatch' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1170: TEST_corrupt_scrub_replicated: err_strings[11]='log_channel[(]cluster[)] log 
[[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ce3f1d6a:::ROBJ1:head : size 9 != size 7 from auth oi 3:ce3f1d6a:::ROBJ1:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 3 dd 2ddbf8f5 od f5fba2c6 alloc_hint [[]0 0 0[]][)], size 9 != size 7 from shard 0' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1171: TEST_corrupt_scrub_replicated: err_strings[12]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:d60617f9:::ROBJ13:head : candidate had a stat error' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1172: TEST_corrupt_scrub_replicated: err_strings[13]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 3:f2a5b2a4:::ROBJ3:head : missing' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1173: TEST_corrupt_scrub_replicated: err_strings[14]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ffdb2004:::ROBJ9:head : candidate size 1 info size 7 mismatch' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1174: TEST_corrupt_scrub_replicated: err_strings[15]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ffdb2004:::ROBJ9:head : object info inconsistent ' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1175: TEST_corrupt_scrub_replicated: err_strings[16]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 3:c0c86b1d:::ROBJ14:head : no '\''_'\'' attr' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1176: 
TEST_corrupt_scrub_replicated: err_strings[17]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 3:5c7b2c47:::ROBJ16:head : can'\''t decode '\''snapset'\'' attr .* v=3 cannot decode .* Malformed input' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1177: TEST_corrupt_scrub_replicated: err_strings[18]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub : stat mismatch, got 19/19 objects, 0/0 clones, 18/19 dirty, 18/19 omap, 0/0 pinned, 0/0 hit_set_archive, 0/0 whiteouts, 1049713/1049720 bytes, 0/0 manifest objects, 0/0 hit_set_archive bytes.' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1178: TEST_corrupt_scrub_replicated: err_strings[19]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 1 missing, 8 inconsistent objects' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1179: TEST_corrupt_scrub_replicated: err_strings[20]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 18 errors' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1180: TEST_corrupt_scrub_replicated: err_strings[21]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:123a5f55:::ROBJ19:head : size 1049600 > 1048576 is too large' 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 
3:30259878:::ROBJ15:head : candidate had a missing info key' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:33aca486:::ROBJ18:head : object info inconsistent ' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:5c7b2c47:::ROBJ16:head : candidate had a corrupt snapset' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:5c7b2c47:::ROBJ16:head : candidate had a missing snapset key' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 
2026-03-08T23:16:27.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:5c7b2c47:::ROBJ16:head : failed to pick suitable object info' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:86586531:::ROBJ8:head : attr value mismatch '\''_key1-ROBJ8'\'', attr name mismatch '\''_key3-ROBJ8'\'', attr name mismatch '\''_key2-ROBJ8'\''' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:bc819597:::ROBJ12:head : candidate had a stat error' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 
soid 3:c0c86b1d:::ROBJ14:head : candidate had a missing info key' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:c0c86b1d:::ROBJ14:head : candidate had a corrupt info' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:c0c86b1d:::ROBJ14:head : failed to pick suitable object info' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ce3f1d6a:::ROBJ1:head : candidate size 9 info size 7 mismatch' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 
2026-03-08T23:16:27.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ce3f1d6a:::ROBJ1:head : size 9 != size 7 from auth oi 3:ce3f1d6a:::ROBJ1:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 3 dd 2ddbf8f5 od f5fba2c6 alloc_hint [[]0 0 0[]][)], size 9 != size 7 from shard 0' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:d60617f9:::ROBJ13:head : candidate had a stat error' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 3:f2a5b2a4:::ROBJ3:head : missing' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: 
TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ffdb2004:::ROBJ9:head : candidate size 1 info size 7 mismatch' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ffdb2004:::ROBJ9:head : object info inconsistent ' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 3:c0c86b1d:::ROBJ14:head : no '\''_'\'' attr' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 3:5c7b2c47:::ROBJ16:head : can'\''t decode '\''snapset'\'' attr .* v=3 cannot decode .* Malformed input' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.597 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub : stat mismatch, got 19/19 objects, 0/0 clones, 18/19 dirty, 18/19 omap, 0/0 pinned, 0/0 hit_set_archive, 0/0 whiteouts, 1049713/1049720 bytes, 0/0 manifest objects, 0/0 hit_set_archive bytes.' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 1 missing, 8 inconsistent objects' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 18 errors' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1182: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:16:27.614 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1184: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:123a5f55:::ROBJ19:head : size 1049600 > 1048576 is too large' td/osd-scrub-repair/osd.1.log 2026-03-08T23:16:27.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1191: TEST_corrupt_scrub_replicated: rados list-inconsistent-pg csr_pool 2026-03-08T23:16:27.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1193: TEST_corrupt_scrub_replicated: jq '. | length' td/osd-scrub-repair/json 2026-03-08T23:16:27.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1193: TEST_corrupt_scrub_replicated: test 1 = 1 2026-03-08T23:16:27.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1195: TEST_corrupt_scrub_replicated: jq -r '.[0]' td/osd-scrub-repair/json 2026-03-08T23:16:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1195: TEST_corrupt_scrub_replicated: test 3.0 = 3.0 2026-03-08T23:16:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1197: TEST_corrupt_scrub_replicated: rados list-inconsistent-obj 3.0 2026-03-08T23:16:27.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1199: TEST_corrupt_scrub_replicated: jq .epoch td/osd-scrub-repair/json 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1199: TEST_corrupt_scrub_replicated: epoch=181 2026-03-08T23:16:27.685 
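The loop traced above iterates `err_strings` and runs `grep -q` against `osd.1.log` with patterns that use bracket expressions such as `[(]` and `[[]` to escape regex metacharacters. A minimal illustration of how one such pattern matches (the sample cluster-log line is hypothetical, constructed to have the shape the scrub test expects, not copied from this log):

```python
import re

# One of the patterns grep'ed in the trace above; bracket expressions like
# [(] and [[] escape metacharacters the same way in POSIX ERE and Python re.
pattern = r'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 18 errors'

# Hypothetical cluster-log line of the shape the test asserts on.
line = 'log_channel(cluster) log [ERR] : 3.0 scrub 18 errors'

# grep -q succeeds exactly when a search like this finds a match.
print(bool(re.search(pattern, line)))  # → True
```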
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1201: TEST_corrupt_scrub_replicated: jq 'def walk(f): 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . 
end)' 2026-03-08T23:16:27.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1201: TEST_corrupt_scrub_replicated: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:16:27.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:1201: TEST_corrupt_scrub_replicated: jq .inconsistents 2026-03-08T23:16:27.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2082: TEST_corrupt_scrub_replicated: jq 'def walk(f): 2026-03-08T23:16:27.707 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:16:27.707 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:16:27.707 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:16:27.707 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:16:27.707 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:16:27.708 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:16:27.708 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:16:27.708 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:16:27.708 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:16:27.708 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:16:27.708 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:16:27.708 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . 
end)' td/osd-scrub-repair/json 2026-03-08T23:16:27.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2082: TEST_corrupt_scrub_replicated: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:16:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2082: TEST_corrupt_scrub_replicated: jq .inconsistents 2026-03-08T23:16:27.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2083: TEST_corrupt_scrub_replicated: multidiff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:16:27.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:16:27.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2084: TEST_corrupt_scrub_replicated: test no = yes 2026-03-08T23:16:27.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2089: TEST_corrupt_scrub_replicated: test '' = yes 2026-03-08T23:16:27.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2094: TEST_corrupt_scrub_replicated: objname=ROBJ9 2026-03-08T23:16:27.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2096: TEST_corrupt_scrub_replicated: echo -n ZZZ 2026-03-08T23:16:27.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2097: TEST_corrupt_scrub_replicated: rados --pool csr_pool put ROBJ9 
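The jq program echoed twice above defines its own `walk` (for jq builds that predate the builtin) and deletes volatile fields (`mtime`, `local_mtime`, `last_reqid`, `version`, `prior_version`) so the `list-inconsistent-obj` output can be diffed against the checked-in expectation with `multidiff`. A sketch of the same normalization in Python (the sample document is a hypothetical fragment in the shape of that output, not taken from this run):

```python
import json

# Fields the jq filter in the trace deletes at every nesting level.
VOLATILE = {"mtime", "local_mtime", "last_reqid", "version", "prior_version"}

def normalize(node):
    """Recursively drop volatile keys, mirroring the jq walk/del filter."""
    if isinstance(node, dict):
        return {k: normalize(v) for k, v in node.items() if k not in VOLATILE}
    if isinstance(node, list):
        return [normalize(v) for v in node]
    return node

# Hypothetical fragment shaped like rados list-inconsistent-obj output.
doc = {"epoch": 181,
       "inconsistents": [{"object": {"name": "ROBJ1", "version": 3},
                          "mtime": "2026-03-08"}]}
print(json.dumps(normalize(doc), sort_keys=True, indent=2))
```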
td/osd-scrub-repair/change 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2099: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ9 set-attr _ td/osd-scrub-repair/robj9-oi 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ9 set-attr _ td/osd-scrub-repair/robj9-oi 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:16:27.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:16:27.749 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:16:27.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:16:27.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:16:27.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:16:27.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:16:27.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:16:27.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:16:27.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ9 set-attr _ td/osd-scrub-repair/robj9-oi 2026-03-08T23:16:27.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:16:27.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:16:27.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:16:27.855 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:16:27.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:16:27.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ9 set-attr _ td/osd-scrub-repair/robj9-oi 2026-03-08T23:16:28.529 INFO:tasks.workunit.client.0.vm03.stderr:Error decoding attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (22) Invalid argument 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 
'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 
2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:29.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:16:29.062 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:16:29.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:16:29.063 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:16:29.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' 
'--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:16:29.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:16:29.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:16:29.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:16:29.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:16:29.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:16:29.077 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:29.078+0000 7f7d00b388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:29.082 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:29.086+0000 7f7d00b388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:29.084 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:29.086+0000 7f7d00b388c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:16:29.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:29.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:16:29.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:29.526 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:29.530+0000 7f7d00b388c0 -1 Falling back to public interface 2026-03-08T23:16:30.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:16:30.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:30.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:16:30.404 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:16:30.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:16:30.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:30.532 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:30.534+0000 7f7d00b388c0 -1 osd.0 183 log_to_monitors true 2026-03-08T23:16:30.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:31.272 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:31.274+0000 7f7cf7ae8640 -1 osd.0 183 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:16:31.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:31.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:31.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:16:31.579 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:16:31.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:31.579 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:16:31.747 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 187 up_thru 187 down_at 184 last_clean_interval [171,183) [v2:127.0.0.1:6802/3376045296,v1:127.0.0.1:6803/3376045296] [v2:127.0.0.1:6804/3376045296,v1:127.0.0.1:6805/3376045296] exists,up b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:16:31.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:16:31.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:16:31.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:16:31.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:16:31.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:16:31.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:16:31.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:16:31.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:16:31.749 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 
2026-03-08T23:16:31.749 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:16:31.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:16:31.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:16:31.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:16:31.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:16:31.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:16:31.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:16:31.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:16:31.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:16:31.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:16:31.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:16:31.981 
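The `get_timeout_delays 90 .1` call above expands to a doubling backoff schedule whose steps are capped at 15s and sum to the 90s timeout. A minimal reimplementation consistent with the printed array (the 15s cap and the exact clipping behaviour are inferred from this output, not read from ceph-helpers.sh):

```python
def get_timeout_delays(timeout=90.0, first_step=0.1, cap=15.0):
    """Doubling backoff: steps double from first_step, are clipped to cap,
    and the final step is trimmed so the total equals the timeout."""
    delays, total, step = [], 0.0, first_step
    while total < timeout:
        step = min(step, cap, timeout - total)  # clip to cap and to remainder
        delays.append(step)
        total += step
        step *= 2
    return delays

print(get_timeout_delays(90, 0.1, 15))
```

With these arguments the schedule reproduces the array in the trace: 0.1 through 12.8 doubling, four 15s steps, and a 4.5s remainder.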
INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:16:31.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:16:31.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:31.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:16:32.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=803158884354 2026-03-08T23:16:32.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 803158884354 2026-03-08T23:16:32.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-803158884354' 2026-03-08T23:16:32.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:32.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:16:32.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=777389080583 2026-03-08T23:16:32.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 777389080583 2026-03-08T23:16:32.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-803158884354 1-777389080583' 
2026-03-08T23:16:32.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:32.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-803158884354 2026-03-08T23:16:32.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:32.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:16:32.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-803158884354 2026-03-08T23:16:32.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:32.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=803158884354 2026-03-08T23:16:32.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 803158884354' 2026-03-08T23:16:32.137 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 803158884354 2026-03-08T23:16:32.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:32.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 734439407626 -lt 803158884354 2026-03-08T23:16:32.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T23:16:33.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:16:33.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:33.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 803158884354 -lt 803158884354 2026-03-08T23:16:33.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:33.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-777389080583 2026-03-08T23:16:33.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:33.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:16:33.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-777389080583 2026-03-08T23:16:33.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:33.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=777389080583 2026-03-08T23:16:33.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 777389080583' 2026-03-08T23:16:33.462 
INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 777389080583 2026-03-08T23:16:33.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:16:33.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 777389080583 -lt 777389080583 2026-03-08T23:16:33.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:16:33.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:33.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:33.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:16:33.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:16:33.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:16:33.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:16:33.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:16:33.838 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:16:33.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:16:33.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:16:34.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:16:34.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:16:34.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:34.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:34.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:16:34.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:16:34.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:16:34.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2100: 
TEST_corrupt_scrub_replicated: rm td/osd-scrub-repair/oi td/osd-scrub-repair/change 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:rm: cannot remove 'td/osd-scrub-repair/oi': No such file or directory 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2102: TEST_corrupt_scrub_replicated: objname=ROBJ10 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2103: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ10 get-attr _ 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ10 get-attr _ 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:16:34.197 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:16:34.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:16:34.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:16:34.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:16:34.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:16:34.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:16:34.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ10 get-attr _ 2026-03-08T23:16:34.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:16:34.302 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:16:34.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:16:34.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:16:34.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:16:34.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ10 get-attr _ 2026-03-08T23:16:34.670 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:30259878:::ROBJ15:head#, (61) No data available 2026-03-08T23:16:34.670 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (61) No data available 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:16:34.954 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:16:34.954 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:34.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 
2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:16:34.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:16:34.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:16:34.956 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:16:34.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 
--auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:16:34.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:16:34.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:16:34.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:16:34.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:16:34.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:16:34.974 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:34.974+0000 7fc95a5ba8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:34.974 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:34.978+0000 7fc95a5ba8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:34.976 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:34.978+0000 7fc95a5ba8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:35.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:35.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:35.934 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:35.938+0000 7fc95a5ba8c0 -1 Falling back to public interface 2026-03-08T23:16:36.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:16:36.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:36.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:16:36.300 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:16:36.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:36.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:36.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:37.191 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:37.194+0000 7fc95a5ba8c0 -1 osd.1 188 log_to_monitors true 2026-03-08T23:16:37.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:37.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:37.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:16:37.468 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:16:37.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:37.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:37.662 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:38.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:38.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:38.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:16:38.664 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:16:38.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:38.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 192 up_thru 192 down_at 189 last_clean_interval [181,188) [v2:127.0.0.1:6810/2934944215,v1:127.0.0.1:6811/2934944215] [v2:127.0.0.1:6812/2934944215,v1:127.0.0.1:6813/2934944215] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:16:38.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:16:38.832 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:16:38.832 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:16:38.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:16:38.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:16:38.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:16:38.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:16:38.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:16:38.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:16:38.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:16:38.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:16:38.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:16:39.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:16:39.066 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:16:39.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:16:39.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:39.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:16:39.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=803158884356 2026-03-08T23:16:39.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 803158884356 2026-03-08T23:16:39.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-803158884356' 2026-03-08T23:16:39.149 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:39.149 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:16:39.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=824633720834 2026-03-08T23:16:39.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 824633720834 2026-03-08T23:16:39.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-803158884356 1-824633720834' 2026-03-08T23:16:39.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:39.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-803158884356 2026-03-08T23:16:39.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:39.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:16:39.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-803158884356 2026-03-08T23:16:39.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:39.235 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=803158884356 2026-03-08T23:16:39.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 803158884356' 2026-03-08T23:16:39.235 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 803158884356 2026-03-08T23:16:39.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:39.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 803158884355 -lt 803158884356 2026-03-08T23:16:39.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:16:40.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:16:40.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:40.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 803158884356 -lt 803158884356 2026-03-08T23:16:40.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:40.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-824633720834 2026-03-08T23:16:40.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:40.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:16:40.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-824633720834 2026-03-08T23:16:40.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:40.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=824633720834 2026-03-08T23:16:40.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 824633720834' 2026-03-08T23:16:40.602 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 824633720834 2026-03-08T23:16:40.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:16:40.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 824633720834 -lt 824633720834 2026-03-08T23:16:40.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:16:40.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:40.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:40.997 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:16:40.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:16:40.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:16:40.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:16:40.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:16:40.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:16:40.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:16:40.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:16:41.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:16:41.167 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:16:41.167 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:41.167 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:41.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:16:41.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:16:41.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:16:41.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2104: TEST_corrupt_scrub_replicated: rados --pool csr_pool setomapval ROBJ10 key2-ROBJ10 val2-ROBJ10 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2105: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 ROBJ10 set-attr _ td/osd-scrub-repair/oi 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 
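The jq expression traced at ceph-helpers.sh:1368 counts PGs whose state string contains both "active" and "clean" but not "stale"; `wait_for_clean` then compares that count against the total PG count (6 == 6 above) before breaking out. The same filter in Python, as a sketch operating on the `pg_stats` array of `ceph --format json pg dump pgs`:

```python
def get_num_active_clean(pg_stats):
    """Mirror of the jq filter in the trace: keep states containing
    both "active" and "clean", drop any that also contain "stale"."""
    return sum(
        1
        for pg in pg_stats
        if "active" in pg["state"]
        and "clean" in pg["state"]
        and "stale" not in pg["state"]
    )
```

A state like `active+clean+scrubbing` still counts, while `stale+active+clean` does not, matching the `select(contains("stale") | not)` clause.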
2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 ROBJ10 set-attr _ td/osd-scrub-repair/oi 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:16:41.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 
2026-03-08T23:16:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:16:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 ROBJ10 set-attr _ td/osd-scrub-repair/oi 2026-03-08T23:16:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:16:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:16:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:16:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:16:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:16:41.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 ROBJ10 set-attr _ td/osd-scrub-repair/oi 2026-03-08T23:16:42.177 INFO:tasks.workunit.client.0.vm03.stderr:Error decoding attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (22) Invalid argument 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:16:42.709 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:16:42.709 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:16:42.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:16:42.710 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:16:42.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:16:42.710 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:16:42.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:16:42.711 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:16:42.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:16:42.711 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:16:42.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:16:42.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:16:42.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:16:42.717 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:16:42.728 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:42.730+0000 7fc53a1d78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:42.728 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:42.730+0000 7fc53a1d78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:42.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:42.730+0000 7fc53a1d78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:16:42.894 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:42.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:16:43.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:43.942 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:43.946+0000 7fc53a1d78c0 -1 Falling back to public interface 2026-03-08T23:16:44.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:44.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:44.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:16:44.075 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:16:44.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:44.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:16:44.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:44.937 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:44.942+0000 7fc53a1d78c0 -1 osd.0 194 log_to_monitors true 2026-03-08T23:16:45.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:45.251 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:45.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:16:45.251 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:16:45.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:45.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:16:45.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:45.718 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:45.723+0000 7fc531187640 -1 osd.0 194 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:16:46.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:46.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:46.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:16:46.435 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:16:46.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:46.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:16:46.611 
INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 198 up_thru 198 down_at 195 last_clean_interval [187,194) [v2:127.0.0.1:6802/2257006380,v1:127.0.0.1:6803/2257006380] [v2:127.0.0.1:6804/2257006380,v1:127.0.0.1:6805/2257006380] exists,up b8b43426-8172-41a4-ad76-a875625b04e6 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 
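The `wait_for_osd` trace above (ceph-helpers.sh:982-991) is a bounded polling loop: grep `ceph osd dump` for `osd.0 up` as many as 300 times, sleeping one second between attempts, and break as soon as the match appears (here on iteration 3). A sketch of that loop, with the `ceph osd dump | grep` check abstracted into a hypothetical `osd_has_state` callable:

```python
import time


def wait_for_osd(state, osd_id, osd_has_state, tries=300, delay=1.0):
    """Sketch of the wait_for_osd polling loop: retry the check up to
    `tries` times, sleeping `delay` seconds between attempts; return
    True once the OSD reports the requested state."""
    for _ in range(tries):
        if osd_has_state(osd_id, state):
            return True
        time.sleep(delay)
    return False
```

The shell helper returns status 0/1 rather than a boolean, but the control flow (counter, check, sleep, break) is the same.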
2026-03-08T23:16:46.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:16:46.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:16:46.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:16:46.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:16:46.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:16:46.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:16:46.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:16:46.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:16:46.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:16:46.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:16:46.858 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:16:46.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 
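`flush_pg_stats`, traced again here, tells each OSD to flush its PG stats, records the sequence number each flush returns (the `' 0-850403524610 1-824633720836'` pairs built up in `seqs`), then polls `ceph osd last-stat-seq` until the monitor has caught up to each sequence — the `test 1 -lt 850403524610 … sleep 1` lines below show that wait in action after osd.0 restarted. A sketch of the pattern, with the two ceph commands abstracted into hypothetical `tell_flush` and `last_stat_seq` callables:

```python
import time


def flush_pg_stats(tell_flush, last_stat_seq, osd_ids, timeout=300):
    """Sketch of the flush_pg_stats helper: flush every OSD's PG stats,
    then wait until the monitor's last-stat-seq reaches each flush's
    sequence number. The real helper sleeps 1 s per poll."""
    seqs = {osd: tell_flush(osd) for osd in osd_ids}
    for osd, seq in seqs.items():
        deadline = time.time() + timeout
        while last_stat_seq(osd) < seq:
            if time.time() > deadline:
                raise TimeoutError(f"osd.{osd} stuck below seq {seq}")
            time.sleep(0.01)  # shell helper: sleep 1 between polls
    return seqs
```

This is why a freshly restarted OSD (whose last-stat-seq resets, as with the `test 1 -lt …` line below) briefly stalls the loop until its first stats report lands.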
2026-03-08T23:16:46.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:46.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:16:46.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=850403524610 2026-03-08T23:16:46.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 850403524610 2026-03-08T23:16:46.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-850403524610' 2026-03-08T23:16:46.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:46.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:16:47.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=824633720836 2026-03-08T23:16:47.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 824633720836 2026-03-08T23:16:47.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-850403524610 1-824633720836' 2026-03-08T23:16:47.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:47.038 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-850403524610 2026-03-08T23:16:47.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:47.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:16:47.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-850403524610 2026-03-08T23:16:47.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:47.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=850403524610 2026-03-08T23:16:47.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 850403524610' 2026-03-08T23:16:47.042 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 850403524610 2026-03-08T23:16:47.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:47.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 850403524610 2026-03-08T23:16:47.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:16:48.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:16:48.232 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:48.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 850403524610 2026-03-08T23:16:48.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:16:49.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:16:49.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:49.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 850403524610 -lt 850403524610 2026-03-08T23:16:49.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:49.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-824633720836 2026-03-08T23:16:49.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:49.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:16:49.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-824633720836 2026-03-08T23:16:49.570 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:49.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=824633720836 2026-03-08T23:16:49.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 824633720836' 2026-03-08T23:16:49.571 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 824633720836 2026-03-08T23:16:49.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:16:49.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 824633720837 -lt 824633720836 2026-03-08T23:16:49.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:16:49.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:49.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:49.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:16:49.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:16:49.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 
2026-03-08T23:16:49.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:16:49.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:16:49.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:16:49.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:16:49.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:16:50.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:16:50.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:16:50.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:50.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:16:50.316 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2106: TEST_corrupt_scrub_replicated: objectstore_tool td/osd-scrub-repair 1 ROBJ10 set-attr _ td/osd-scrub-repair/oi 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 ROBJ10 set-attr _ td/osd-scrub-repair/oi 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: 
_objectstore_tool_nowait: local id=1 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:16:50.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:16:50.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:16:50.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:16:50.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:16:50.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 ROBJ10 set-attr _ td/osd-scrub-repair/oi 2026-03-08T23:16:50.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:16:50.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: 
_objectstore_tool_nodown: shift 2026-03-08T23:16:50.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:16:50.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:16:50.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:16:50.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 ROBJ10 set-attr _ td/osd-scrub-repair/oi 2026-03-08T23:16:51.107 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:30259878:::ROBJ15:head#, (61) No data available 2026-03-08T23:16:51.107 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 3.0_head,#3:c0c86b1d:::ROBJ14:head#, (61) No data available 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 
2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:16:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 
2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:16:51.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:16:51.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:16:51.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:16:51.639 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:16:51.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:16:51.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:16:51.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:16:51.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:16:51.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:16:51.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:16:51.657 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:51.659+0000 7f79817c78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:51.657 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:51.663+0000 7f79817c78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:16:51.659 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:51.663+0000 7f79817c78c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:51.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:51.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:52.114 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:52.119+0000 7f79817c78c0 -1 Falling back to public interface 2026-03-08T23:16:52.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:16:52.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:52.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:16:52.993 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:16:52.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:52.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:53.119 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:53.123+0000 7f79817c78c0 -1 osd.1 199 log_to_monitors true 2026-03-08T23:16:53.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:16:53.890 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:16:53.895+0000 7f7978777640 -1 osd.1 199 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:16:54.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:16:54.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:16:54.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:16:54.172 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:16:54.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:16:54.172 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 203 up_thru 203 down_at 200 last_clean_interval [192,199) [v2:127.0.0.1:6810/3802862204,v1:127.0.0.1:6811/3802862204] [v2:127.0.0.1:6812/3802862204,v1:127.0.0.1:6813/3802862204] exists,up f603f62e-24cf-4fca-903d-50e492fba08d 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:16:54.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:16:54.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 
2026-03-08T23:16:54.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:16:54.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:16:54.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:16:54.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:16:54.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:16:54.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:16:54.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:16:54.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:16:54.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:16:54.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:16:54.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:16:54.594 
INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:16:54.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:16:54.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:54.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:16:54.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=850403524612 2026-03-08T23:16:54.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 850403524612 2026-03-08T23:16:54.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-850403524612' 2026-03-08T23:16:54.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:54.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:16:54.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=871878361090 2026-03-08T23:16:54.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 871878361090 2026-03-08T23:16:54.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-850403524612 1-871878361090' 
2026-03-08T23:16:54.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:54.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-850403524612 2026-03-08T23:16:54.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:54.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:16:54.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-850403524612 2026-03-08T23:16:54.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:54.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=850403524612 2026-03-08T23:16:54.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 850403524612' 2026-03-08T23:16:54.769 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 850403524612 2026-03-08T23:16:54.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:54.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 850403524611 -lt 850403524612 2026-03-08T23:16:54.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T23:16:55.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:16:55.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:56.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 850403524611 -lt 850403524612 2026-03-08T23:16:56.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:16:57.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:16:57.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:16:57.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 850403524613 -lt 850403524612 2026-03-08T23:16:57.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:57.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-871878361090 2026-03-08T23:16:57.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:57.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:16:57.292 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-871878361090 2026-03-08T23:16:57.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:57.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=871878361090 2026-03-08T23:16:57.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 871878361090' 2026-03-08T23:16:57.293 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 871878361090 2026-03-08T23:16:57.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:16:57.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 871878361090 -lt 871878361090 2026-03-08T23:16:57.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:16:57.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:57.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:57.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:16:57.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 
2026-03-08T23:16:57.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:16:57.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:16:57.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:16:57.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:16:57.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:16:57.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:16:57.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:16:57.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:16:57.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:16:57.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:16:58.046 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:16:58.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:16:58.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:16:58.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2107: TEST_corrupt_scrub_replicated: rm td/osd-scrub-repair/oi 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2109: TEST_corrupt_scrub_replicated: inject_eio rep data csr_pool ROBJ11 td/osd-scrub-repair 0 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=rep 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=csr_pool 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T23:16:58.047 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=ROBJ11 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/osd-scrub-repair 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds csr_pool ROBJ11 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=csr_pool 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=ROBJ11 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map csr_pool ROBJ11 2026-03-08T23:16:58.047 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:16:58.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: 
local 'osds=1 2026-03-08T23:16:58.227 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:16:58.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2026-03-08T23:16:58.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('1' '0') 2026-03-08T23:16:58.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T23:16:58.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1 2026-03-08T23:16:58.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' rep '!=' ec ']' 2026-03-08T23:16:58.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2473: inject_eio: shard_id= 2026-03-08T23:16:58.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/osd-scrub-repair/1/type 2026-03-08T23:16:58.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T23:16:58.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1 2026-03-08T23:16:58.229 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:58.229 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:58.230 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T23:16:58.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.1.asok config set 
bluestore_debug_inject_read_err true 2026-03-08T23:16:58.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T23:16:58.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T23:16:58.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T23:16:58.299 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.43024/ceph-osd.1.asok injectdataerr csr_pool ROBJ11 2026-03-08T23:16:58.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2110: TEST_corrupt_scrub_replicated: inject_eio rep mdata csr_pool ROBJ12 td/osd-scrub-repair 1 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=rep 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=mdata 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=csr_pool 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=ROBJ12 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/osd-scrub-repair 2026-03-08T23:16:58.364 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds csr_pool ROBJ12 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=csr_pool 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=ROBJ12 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map csr_pool ROBJ12 2026-03-08T23:16:58.364 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:16:58.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1 2026-03-08T23:16:58.534 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:16:58.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2026-03-08T23:16:58.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('1' '0') 2026-03-08T23:16:58.534 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T23:16:58.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=0 2026-03-08T23:16:58.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' rep '!=' ec ']' 2026-03-08T23:16:58.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2473: inject_eio: shard_id= 2026-03-08T23:16:58.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/osd-scrub-repair/0/type 2026-03-08T23:16:58.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T23:16:58.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 0 bluestore_debug_inject_read_err true 2026-03-08T23:16:58.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T23:16:58.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=0 2026-03-08T23:16:58.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T23:16:58.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T23:16:58.536 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T23:16:58.536 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.0 2026-03-08T23:16:58.536 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:16:58.536 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:16:58.536 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:58.536 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:58.536 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:58.537 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:16:58.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.0.asok config set bluestore_debug_inject_read_err true 2026-03-08T23:16:58.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T23:16:58.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 
2026-03-08T23:16:58.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T23:16:58.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.0 2026-03-08T23:16:58.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:16:58.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:16:58.607 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:58.607 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:58.607 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:58.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:16:58.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T23:16:58.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.43024/ceph-osd.0.asok injectmdataerr csr_pool ROBJ12 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2111: TEST_corrupt_scrub_replicated: inject_eio rep mdata csr_pool ROBJ13 
td/osd-scrub-repair 1 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=rep 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=mdata 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=csr_pool 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=ROBJ13 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/osd-scrub-repair 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=1 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 
2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds csr_pool ROBJ13 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=csr_pool 2026-03-08T23:16:58.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=ROBJ13 2026-03-08T23:16:58.677 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map csr_pool ROBJ13 2026-03-08T23:16:58.677 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:16:58.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1 2026-03-08T23:16:58.844 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:16:58.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2026-03-08T23:16:58.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('1' '0') 2026-03-08T23:16:58.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T23:16:58.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=0 2026-03-08T23:16:58.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' rep '!=' ec ']' 2026-03-08T23:16:58.845 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2473: inject_eio: shard_id= 2026-03-08T23:16:58.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/osd-scrub-repair/0/type 2026-03-08T23:16:58.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T23:16:58.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 0 bluestore_debug_inject_read_err true 2026-03-08T23:16:58.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T23:16:58.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=0 2026-03-08T23:16:58.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T23:16:58.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T23:16:58.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T23:16:58.847 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.0 2026-03-08T23:16:58.847 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:16:58.847 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:16:58.847 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:58.847 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:58.847 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:58.848 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:16:58.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.0.asok config set bluestore_debug_inject_read_err true 2026-03-08T23:16:58.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T23:16:58.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T23:16:58.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T23:16:58.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.0 2026-03-08T23:16:58.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:16:58.918 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:16:58.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:58.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:58.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:58.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.0.asok 2026-03-08T23:16:58.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T23:16:58.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.43024/ceph-osd.0.asok injectmdataerr csr_pool ROBJ13 2026-03-08T23:16:58.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2112: TEST_corrupt_scrub_replicated: inject_eio rep data csr_pool ROBJ13 td/osd-scrub-repair 0 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2457: inject_eio: local pooltype=rep 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2458: inject_eio: shift 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2459: inject_eio: local which=data 
2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2460: inject_eio: shift 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2461: inject_eio: local poolname=csr_pool 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2462: inject_eio: shift 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2463: inject_eio: local objname=ROBJ13 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2464: inject_eio: shift 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2465: inject_eio: local dir=td/osd-scrub-repair 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2466: inject_eio: shift 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2467: inject_eio: local shard_id=0 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2468: inject_eio: shift 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: get_osds csr_pool ROBJ13 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=csr_pool 2026-03-08T23:16:58.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=ROBJ13 
2026-03-08T23:16:58.986 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map csr_pool ROBJ13 2026-03-08T23:16:58.986 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: initial_osds=('1' '0') 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2470: inject_eio: local -a initial_osds 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2471: inject_eio: local osd_id=1 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2472: inject_eio: '[' rep '!=' ec ']' 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2473: inject_eio: shard_id= 2026-03-08T23:16:59.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: cat td/osd-scrub-repair/1/type 2026-03-08T23:16:59.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2475: inject_eio: type=bluestore 2026-03-08T23:16:59.161 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2476: inject_eio: set_config osd 1 bluestore_debug_inject_read_err true 2026-03-08T23:16:59.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1161: set_config: local daemon=osd 2026-03-08T23:16:59.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1162: set_config: local id=1 2026-03-08T23:16:59.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1163: set_config: local config=bluestore_debug_inject_read_err 2026-03-08T23:16:59.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1164: set_config: local value=true 2026-03-08T23:16:59.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: jq 'has("success")' 2026-03-08T23:16:59.162 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: get_asok_path osd.1 2026-03-08T23:16:59.162 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:16:59.162 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:16:59.162 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:59.162 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:59.162 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:59.162 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T23:16:59.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: env CEPH_ARGS= ceph --format json daemon /tmp/ceph-asok.43024/ceph-osd.1.asok config set bluestore_debug_inject_read_err true 2026-03-08T23:16:59.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1168: set_config: test true == true 2026-03-08T23:16:59.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2477: inject_eio: local loop=0 2026-03-08T23:16:59.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2479: inject_eio: grep -q Invalid 2026-03-08T23:16:59.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: get_asok_path osd.1 2026-03-08T23:16:59.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:16:59.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:16:59.237 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:16:59.237 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:16:59.237 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:16:59.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-osd.1.asok 2026-03-08T23:16:59.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: CEPH_ARGS= 2026-03-08T23:16:59.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2478: inject_eio: ceph --admin-daemon /tmp/ceph-asok.43024/ceph-osd.1.asok injectdataerr csr_pool ROBJ13 2026-03-08T23:16:59.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2115: TEST_corrupt_scrub_replicated: ceph tell 'osd.*' injectargs -- --osd-max-object-size=134217728 2026-03-08T23:16:59.374 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: {} 2026-03-08T23:16:59.374 INFO:tasks.workunit.client.0.vm03.stderr:osd.0: osd_max_object_size = '' 2026-03-08T23:16:59.383 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: {} 2026-03-08T23:16:59.383 INFO:tasks.workunit.client.0.vm03.stderr:osd.1: osd_max_object_size = '' 2026-03-08T23:16:59.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2117: TEST_corrupt_scrub_replicated: pg_deep_scrub 3.0 2026-03-08T23:16:59.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1941: pg_deep_scrub: local pgid=3.0 2026-03-08T23:16:59.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1943: pg_deep_scrub: wait_for_pg_clean 3.0 2026-03-08T23:16:59.393 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=3.0 2026-03-08T23:16:59.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:16:59.394 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:16:59.394 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:16:59.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:16:59.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:16:59.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:16:59.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:16:59.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:16:59.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:16:59.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: 
flush_pg_stats 2026-03-08T23:16:59.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:16:59.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:16:59.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:16:59.729 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:16:59.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:16:59.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:59.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:16:59.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=850403524614 2026-03-08T23:16:59.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 850403524614 2026-03-08T23:16:59.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-850403524614' 2026-03-08T23:16:59.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:16:59.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 
2026-03-08T23:16:59.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=871878361092 2026-03-08T23:16:59.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 871878361092 2026-03-08T23:16:59.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-850403524614 1-871878361092' 2026-03-08T23:16:59.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:16:59.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-850403524614 2026-03-08T23:16:59.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:16:59.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:16:59.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-850403524614 2026-03-08T23:16:59.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:16:59.904 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 850403524614 2026-03-08T23:16:59.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=850403524614 2026-03-08T23:16:59.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 
850403524614' 2026-03-08T23:16:59.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:00.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 850403524613 -lt 850403524614 2026-03-08T23:17:00.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:17:01.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:17:01.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:01.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 850403524615 -lt 850403524614 2026-03-08T23:17:01.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:01.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-871878361092 2026-03-08T23:17:01.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:01.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:17:01.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-871878361092 2026-03-08T23:17:01.239 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:01.241 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 871878361092 2026-03-08T23:17:01.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=871878361092 2026-03-08T23:17:01.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 871878361092' 2026-03-08T23:17:01.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:17:01.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 871878361092 -lt 871878361092 2026-03-08T23:17:01.411 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 3.0 loop 0 2026-03-08T23:17:01.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:17:01.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 3.0 loop 0' 2026-03-08T23:17:01.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 3.0 2026-03-08T23:17:01.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=3.0 2026-03-08T23:17:01.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:17:01.411 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 3.0 query 2026-03-08T23:17:01.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:17:01.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:17:01.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:17:01.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:17:01.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:17:01.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:17:01.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:17:01.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:17:01.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:01.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | 
.last_deep_scrub_stamp' 2026-03-08T23:17:01.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: local last_scrub=2026-03-08T23:13:16.031237+0000 2026-03-08T23:17:01.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: pg_deep_scrub: ceph pg deep-scrub 3.0 2026-03-08T23:17:01.811 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0 on osd.1 to deep-scrub 2026-03-08T23:17:01.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: pg_deep_scrub: wait_for_scrub 3.0 2026-03-08T23:13:16.031237+0000 last_deep_scrub_stamp 2026-03-08T23:17:01.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0 2026-03-08T23:17:01.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:13:16.031237+0000 2026-03-08T23:17:01.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_deep_scrub_stamp 2026-03-08T23:17:01.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:17:01.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:17:01.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:17:01.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local 
pgid=3.0 2026-03-08T23:17:01.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:17:01.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:01.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:17:02.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:13:16.031237+0000 '>' 2026-03-08T23:13:16.031237+0000 2026-03-08T23:17:02.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:17:03.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:17:03.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:17:03.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:17:03.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:17:03.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:17:03.004 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:03.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:17:03.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:13:16.031237+0000 '>' 2026-03-08T23:13:16.031237+0000 2026-03-08T23:17:03.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:17:04.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:17:04.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:17:04.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:17:04.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:17:04.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:17:04.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:04.173 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:17:04.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:13:16.031237+0000 '>' 2026-03-08T23:13:16.031237+0000 2026-03-08T23:17:04.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:17:05.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:17:05.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:17:05.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_deep_scrub_stamp 2026-03-08T23:17:05.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:17:05.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:17:05.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:05.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_deep_scrub_stamp' 2026-03-08T23:17:05.525 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:01.919473+0000 '>' 2026-03-08T23:13:16.031237+0000 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2119: TEST_corrupt_scrub_replicated: err_strings=() 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2120: TEST_corrupt_scrub_replicated: err_strings[0]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:30259878:::ROBJ15:head : candidate had a missing info key' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2121: TEST_corrupt_scrub_replicated: err_strings[1]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:33aca486:::ROBJ18:head : data_digest 0xbd89c912 != data_digest 0x2ddbf8f5 from auth oi 3:33aca486:::ROBJ18:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 54 dd 2ddbf8f5 od ddc3680f alloc_hint [[]0 0 255[]][)], object info inconsistent ' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2122: TEST_corrupt_scrub_replicated: err_strings[2]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:33aca486:::ROBJ18:head : data_digest 0xbd89c912 != data_digest 0x2ddbf8f5 from auth oi 3:33aca486:::ROBJ18:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 54 dd 2ddbf8f5 od ddc3680f alloc_hint [[]0 0 255[]][)]' 2026-03-08T23:17:05.525 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2123: TEST_corrupt_scrub_replicated: err_strings[3]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:33aca486:::ROBJ18:head : failed to pick suitable auth object' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2124: TEST_corrupt_scrub_replicated: err_strings[4]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:5c7b2c47:::ROBJ16:head : candidate had a corrupt snapset' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2125: TEST_corrupt_scrub_replicated: err_strings[5]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:5c7b2c47:::ROBJ16:head : candidate had a missing snapset key' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2126: TEST_corrupt_scrub_replicated: err_strings[6]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:5c7b2c47:::ROBJ16:head : failed to pick suitable object info' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2127: TEST_corrupt_scrub_replicated: err_strings[7]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:86586531:::ROBJ8:head : attr value mismatch '\''_key1-ROBJ8'\'', attr name mismatch '\''_key3-ROBJ8'\'', attr name mismatch '\''_key2-ROBJ8'\''' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2128: TEST_corrupt_scrub_replicated: err_strings[8]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:87abbf36:::ROBJ11:head : candidate had a read error' 2026-03-08T23:17:05.525 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2129: TEST_corrupt_scrub_replicated: err_strings[9]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:8aa5320e:::ROBJ17:head : data_digest 0x5af0c3ef != data_digest 0x2ddbf8f5 from auth oi 3:8aa5320e:::ROBJ17:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 51 dd 2ddbf8f5 od e9572720 alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2130: TEST_corrupt_scrub_replicated: err_strings[10]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:8aa5320e:::ROBJ17:head : data_digest 0x5af0c3ef != data_digest 0x2ddbf8f5 from auth oi 3:8aa5320e:::ROBJ17:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 51 dd 2ddbf8f5 od e9572720 alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2131: TEST_corrupt_scrub_replicated: err_strings[11]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:8aa5320e:::ROBJ17:head : failed to pick suitable auth object' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2132: TEST_corrupt_scrub_replicated: err_strings[12]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:8b55fa4b:::ROBJ7:head : omap_digest 0xefced57a != omap_digest 0x6a73cc07 from shard 1' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2133: TEST_corrupt_scrub_replicated: err_strings[13]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:8b55fa4b:::ROBJ7:head : omap_digest 0x6a73cc07 != omap_digest 0xefced57a from auth oi 
3:8b55fa4b:::ROBJ7:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 21 dd 2ddbf8f5 od efced57a alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2134: TEST_corrupt_scrub_replicated: err_strings[14]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:a53c12e8:::ROBJ6:head : omap_digest 0x689ee887 != omap_digest 0x179c919f from shard 1, omap_digest 0x689ee887 != omap_digest 0x179c919f from auth oi 3:a53c12e8:::ROBJ6:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 18 dd 2ddbf8f5 od 179c919f alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2135: TEST_corrupt_scrub_replicated: err_strings[15]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:b1f19cbd:::ROBJ10:head : omap_digest 0xa8dd5adc != omap_digest 0xc2025a24 from auth oi 3:b1f19cbd:::ROBJ10:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 30 dd 2ddbf8f5 od c2025a24 alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2136: TEST_corrupt_scrub_replicated: err_strings[16]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:b1f19cbd:::ROBJ10:head : omap_digest 0xa8dd5adc != omap_digest 0xc2025a24 from auth oi 3:b1f19cbd:::ROBJ10:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 30 dd 2ddbf8f5 od c2025a24 alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2137: TEST_corrupt_scrub_replicated: err_strings[17]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 
3:b1f19cbd:::ROBJ10:head : failed to pick suitable auth object' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2138: TEST_corrupt_scrub_replicated: err_strings[18]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:bc819597:::ROBJ12:head : candidate had a stat error' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2139: TEST_corrupt_scrub_replicated: err_strings[19]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:c0c86b1d:::ROBJ14:head : candidate had a missing info key' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2140: TEST_corrupt_scrub_replicated: err_strings[20]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:c0c86b1d:::ROBJ14:head : candidate had a corrupt info' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2141: TEST_corrupt_scrub_replicated: err_strings[21]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:c0c86b1d:::ROBJ14:head : failed to pick suitable object info' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2142: TEST_corrupt_scrub_replicated: err_strings[22]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ce3f1d6a:::ROBJ1:head : candidate size 9 info size 7 mismatch' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2143: TEST_corrupt_scrub_replicated: err_strings[23]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ce3f1d6a:::ROBJ1:head : data_digest 0x2d4a11c2 != data_digest 0x2ddbf8f5 
from shard 0, data_digest 0x2d4a11c2 != data_digest 0x2ddbf8f5 from auth oi 3:ce3f1d6a:::ROBJ1:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 3 dd 2ddbf8f5 od f5fba2c6 alloc_hint [[]0 0 0[]][)], size 9 != size 7 from auth oi 3:ce3f1d6a:::ROBJ1:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 3 dd 2ddbf8f5 od f5fba2c6 alloc_hint [[]0 0 0[]][)], size 9 != size 7 from shard 0' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2144: TEST_corrupt_scrub_replicated: err_strings[24]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:d60617f9:::ROBJ13:head : candidate had a read error' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2145: TEST_corrupt_scrub_replicated: err_strings[25]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:d60617f9:::ROBJ13:head : candidate had a stat error' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2146: TEST_corrupt_scrub_replicated: err_strings[26]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:d60617f9:::ROBJ13:head : failed to pick suitable object info' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2147: TEST_corrupt_scrub_replicated: err_strings[27]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:e97ce31e:::ROBJ2:head : data_digest 0x578a4830 != data_digest 0x2ddbf8f5 from shard 1, data_digest 0x578a4830 != data_digest 0x2ddbf8f5 from auth oi 3:e97ce31e:::ROBJ2:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 6 dd 2ddbf8f5 od f8e11918 alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.526 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2148: TEST_corrupt_scrub_replicated: err_strings[28]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 3:f2a5b2a4:::ROBJ3:head : missing' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2149: TEST_corrupt_scrub_replicated: err_strings[29]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:f4981d31:::ROBJ4:head : omap_digest 0xd7178dfe != omap_digest 0xe2d46ea4 from shard 1, omap_digest 0xd7178dfe != omap_digest 0xe2d46ea4 from auth oi 3:f4981d31:::ROBJ4:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 12 dd 2ddbf8f5 od e2d46ea4 alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2150: TEST_corrupt_scrub_replicated: err_strings[30]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:f4bfd4d1:::ROBJ5:head : omap_digest 0x1a862a41 != omap_digest 0x6cac8f6 from shard 1' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2151: TEST_corrupt_scrub_replicated: err_strings[31]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:f4bfd4d1:::ROBJ5:head : omap_digest 0x6cac8f6 != omap_digest 0x1a862a41 from auth oi 3:f4bfd4d1:::ROBJ5:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 15 dd 2ddbf8f5 od 1a862a41 alloc_hint [[]0 0 0[]][)]' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2152: TEST_corrupt_scrub_replicated: err_strings[32]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:ffdb2004:::ROBJ9:head : candidate size 3 info size 7 
mismatch' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2153: TEST_corrupt_scrub_replicated: err_strings[33]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:ffdb2004:::ROBJ9:head : object info inconsistent ' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2154: TEST_corrupt_scrub_replicated: err_strings[34]='log_channel[(]cluster[)] log [[]ERR[]] : deep-scrub [0-9]*[.]0 3:c0c86b1d:::ROBJ14:head : no '\''_'\'' attr' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2155: TEST_corrupt_scrub_replicated: err_strings[35]='log_channel[(]cluster[)] log [[]ERR[]] : deep-scrub [0-9]*[.]0 3:5c7b2c47:::ROBJ16:head : can'\''t decode '\''snapset'\'' attr .* v=3 cannot decode .* Malformed input' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2156: TEST_corrupt_scrub_replicated: err_strings[36]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 deep-scrub : stat mismatch, got 19/19 objects, 0/0 clones, 18/19 dirty, 18/19 omap, 0/0 pinned, 0/0 hit_set_archive, 0/0 whiteouts, 1049715/1049716 bytes, 0/0 manifest objects, 0/0 hit_set_archive bytes.' 
2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2157: TEST_corrupt_scrub_replicated: err_strings[37]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 deep-scrub 1 missing, 11 inconsistent objects' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2158: TEST_corrupt_scrub_replicated: err_strings[38]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 deep-scrub 35 errors' 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:30259878:::ROBJ15:head : candidate had a missing info key' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:33aca486:::ROBJ18:head : data_digest 0xbd89c912 != data_digest 0x2ddbf8f5 from auth oi 3:33aca486:::ROBJ18:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 54 dd 2ddbf8f5 od ddc3680f alloc_hint [[]0 0 255[]][)], object info inconsistent ' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.538 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:33aca486:::ROBJ18:head : data_digest 0xbd89c912 != data_digest 0x2ddbf8f5 from auth oi 3:33aca486:::ROBJ18:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 54 dd 2ddbf8f5 od ddc3680f alloc_hint [[]0 0 255[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:33aca486:::ROBJ18:head : failed to pick suitable auth object' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:5c7b2c47:::ROBJ16:head : candidate had a corrupt snapset' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: 
TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:5c7b2c47:::ROBJ16:head : candidate had a missing snapset key' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:5c7b2c47:::ROBJ16:head : failed to pick suitable object info' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:86586531:::ROBJ8:head : attr value mismatch '\''_key1-ROBJ8'\'', attr name mismatch '\''_key3-ROBJ8'\'', attr name mismatch '\''_key2-ROBJ8'\''' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: 
TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:87abbf36:::ROBJ11:head : candidate had a read error' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:8aa5320e:::ROBJ17:head : data_digest 0x5af0c3ef != data_digest 0x2ddbf8f5 from auth oi 3:8aa5320e:::ROBJ17:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 51 dd 2ddbf8f5 od e9572720 alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:8aa5320e:::ROBJ17:head : data_digest 0x5af0c3ef != data_digest 0x2ddbf8f5 from auth oi 3:8aa5320e:::ROBJ17:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 51 dd 2ddbf8f5 od e9572720 alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.590 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:8aa5320e:::ROBJ17:head : failed to pick suitable auth object' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:8b55fa4b:::ROBJ7:head : omap_digest 0xefced57a != omap_digest 0x6a73cc07 from shard 1' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:8b55fa4b:::ROBJ7:head : omap_digest 0x6a73cc07 != omap_digest 0xefced57a from auth oi 3:8b55fa4b:::ROBJ7:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 21 dd 2ddbf8f5 od efced57a alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.606 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:a53c12e8:::ROBJ6:head : omap_digest 0x689ee887 != omap_digest 0x179c919f from shard 1, omap_digest 0x689ee887 != omap_digest 0x179c919f from auth oi 3:a53c12e8:::ROBJ6:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 18 dd 2ddbf8f5 od 179c919f alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:b1f19cbd:::ROBJ10:head : omap_digest 0xa8dd5adc != omap_digest 0xc2025a24 from auth oi 3:b1f19cbd:::ROBJ10:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 30 dd 2ddbf8f5 od c2025a24 alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:b1f19cbd:::ROBJ10:head : omap_digest 0xa8dd5adc != omap_digest 0xc2025a24 from auth oi 3:b1f19cbd:::ROBJ10:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 30 dd 2ddbf8f5 od c2025a24 
alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:b1f19cbd:::ROBJ10:head : failed to pick suitable auth object' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:bc819597:::ROBJ12:head : candidate had a stat error' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:c0c86b1d:::ROBJ14:head : candidate had a missing info key' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.632 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:c0c86b1d:::ROBJ14:head : candidate had a corrupt info' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:c0c86b1d:::ROBJ14:head : failed to pick suitable object info' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ce3f1d6a:::ROBJ1:head : candidate size 9 info size 7 mismatch' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:ce3f1d6a:::ROBJ1:head : data_digest 0x2d4a11c2 != data_digest 0x2ddbf8f5 from shard 0, 
data_digest 0x2d4a11c2 != data_digest 0x2ddbf8f5 from auth oi 3:ce3f1d6a:::ROBJ1:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 3 dd 2ddbf8f5 od f5fba2c6 alloc_hint [[]0 0 0[]][)], size 9 != size 7 from auth oi 3:ce3f1d6a:::ROBJ1:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 3 dd 2ddbf8f5 od f5fba2c6 alloc_hint [[]0 0 0[]][)], size 9 != size 7 from shard 0' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:d60617f9:::ROBJ13:head : candidate had a read error' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:d60617f9:::ROBJ13:head : candidate had a stat error' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] 
log [[]ERR[]] : [0-9]*[.]0 soid 3:d60617f9:::ROBJ13:head : failed to pick suitable object info' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:e97ce31e:::ROBJ2:head : data_digest 0x578a4830 != data_digest 0x2ddbf8f5 from shard 1, data_digest 0x578a4830 != data_digest 0x2ddbf8f5 from auth oi 3:e97ce31e:::ROBJ2:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 6 dd 2ddbf8f5 od f8e11918 alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 3:f2a5b2a4:::ROBJ3:head : missing' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:f4981d31:::ROBJ4:head : omap_digest 0xd7178dfe != omap_digest 
0xe2d46ea4 from shard 1, omap_digest 0xd7178dfe != omap_digest 0xe2d46ea4 from auth oi 3:f4981d31:::ROBJ4:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 12 dd 2ddbf8f5 od e2d46ea4 alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid 3:f4bfd4d1:::ROBJ5:head : omap_digest 0x1a862a41 != omap_digest 0x6cac8f6 from shard 1' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 1 soid 3:f4bfd4d1:::ROBJ5:head : omap_digest 0x6cac8f6 != omap_digest 0x1a862a41 from auth oi 3:f4bfd4d1:::ROBJ5:head[(][0-9]*'\''[0-9]* osd.1.0:[0-9]* dirty|omap|data_digest|omap_digest s 7 uv 15 dd 2ddbf8f5 od 1a862a41 alloc_hint [[]0 0 0[]][)]' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 
'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:ffdb2004:::ROBJ9:head : candidate size 3 info size 7 mismatch' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 0 soid 3:ffdb2004:::ROBJ9:head : object info inconsistent ' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : deep-scrub [0-9]*[.]0 3:c0c86b1d:::ROBJ14:head : no '\''_'\'' attr' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : deep-scrub [0-9]*[.]0 3:5c7b2c47:::ROBJ16:head : can'\''t decode '\''snapset'\'' attr .* v=3 cannot decode .* Malformed input' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.713 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 deep-scrub : stat mismatch, got 19/19 objects, 0/0 clones, 18/19 dirty, 18/19 omap, 0/0 pinned, 0/0 hit_set_archive, 0/0 whiteouts, 1049715/1049716 bytes, 0/0 manifest objects, 0/0 hit_set_archive bytes.' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 deep-scrub 1 missing, 11 inconsistent objects' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2160: TEST_corrupt_scrub_replicated: for err_string in "${err_strings[@]}" 2026-03-08T23:17:05.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2162: TEST_corrupt_scrub_replicated: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 deep-scrub 35 errors' td/osd-scrub-repair/osd.1.log 2026-03-08T23:17:05.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2169: TEST_corrupt_scrub_replicated: rados list-inconsistent-pg csr_pool 2026-03-08T23:17:05.754 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2171: TEST_corrupt_scrub_replicated: jq '. | length' td/osd-scrub-repair/json 2026-03-08T23:17:05.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2171: TEST_corrupt_scrub_replicated: test 1 = 1 2026-03-08T23:17:05.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2173: TEST_corrupt_scrub_replicated: jq -r '.[0]' td/osd-scrub-repair/json 2026-03-08T23:17:05.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2173: TEST_corrupt_scrub_replicated: test 3.0 = 3.0 2026-03-08T23:17:05.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2175: TEST_corrupt_scrub_replicated: rados list-inconsistent-obj 3.0 2026-03-08T23:17:05.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2177: TEST_corrupt_scrub_replicated: jq .epoch td/osd-scrub-repair/json 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2177: TEST_corrupt_scrub_replicated: epoch=203 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2179: TEST_corrupt_scrub_replicated: jq 'def walk(f): 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . 
+ { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:17:05.806 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' 2026-03-08T23:17:05.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2179: TEST_corrupt_scrub_replicated: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:17:05.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:2179: TEST_corrupt_scrub_replicated: jq .inconsistents 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3558: TEST_corrupt_scrub_replicated: jq 'def walk(f): 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr: . as $in 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . 
+ { ($key): ($in[$key] | walk(f)) } ) | f 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr: else f 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr: end; 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end) 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end) 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end) 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end) 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' td/osd-scrub-repair/json 2026-03-08T23:17:05.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3558: TEST_corrupt_scrub_replicated: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))' 2026-03-08T23:17:05.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3558: TEST_corrupt_scrub_replicated: jq .inconsistents 2026-03-08T23:17:05.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3559: TEST_corrupt_scrub_replicated: multidiff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:17:05.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson 2026-03-08T23:17:05.857 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3560: TEST_corrupt_scrub_replicated: test no = yes 2026-03-08T23:17:05.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3565: TEST_corrupt_scrub_replicated: test '' = yes 2026-03-08T23:17:05.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3570: TEST_corrupt_scrub_replicated: repair 3.0 2026-03-08T23:17:05.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=3.0 2026-03-08T23:17:05.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 3.0 2026-03-08T23:17:05.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:17:05.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:17:05.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:05.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:17:06.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:17:01.919473+0000 2026-03-08T23:17:06.027 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 3.0 2026-03-08T23:17:06.191 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0 on osd.1 to repair 2026-03-08T23:17:06.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 3.0 2026-03-08T23:17:01.919473+0000 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:17:01.919473+0000 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:17:06.207 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:06.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:17:06.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:01.919473+0000 '>' 2026-03-08T23:17:01.919473+0000 2026-03-08T23:17:06.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:17:07.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:17:07.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:17:07.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:17:07.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:17:07.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:17:07.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:07.385 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:17:07.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:01.919473+0000 '>' 2026-03-08T23:17:01.919473+0000 2026-03-08T23:17:07.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:17:08.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:17:08.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:17:08.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp 2026-03-08T23:17:08.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0 2026-03-08T23:17:08.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:17:08.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:17:08.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp' 2026-03-08T23:17:08.731 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:06.959735+0000 '>' 2026-03-08T23:17:01.919473+0000 2026-03-08T23:17:08.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:17:08.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3571: TEST_corrupt_scrub_replicated: wait_for_clean 2026-03-08T23:17:08.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:17:08.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:17:08.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:17:08.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:17:08.732 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:17:08.732 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:17:08.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:17:08.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:17:08.732 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:17:08.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:17:08.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:17:08.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:17:08.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:17:08.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:17:08.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:17:08.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:17:08.967 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:17:08.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:17:08.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:08.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 
2026-03-08T23:17:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=850403524617 2026-03-08T23:17:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 850403524617 2026-03-08T23:17:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-850403524617' 2026-03-08T23:17:09.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:09.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:17:09.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=871878361095 2026-03-08T23:17:09.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 871878361095 2026-03-08T23:17:09.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-850403524617 1-871878361095' 2026-03-08T23:17:09.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:09.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-850403524617 2026-03-08T23:17:09.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:09.133 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:17:09.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-850403524617 2026-03-08T23:17:09.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:09.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=850403524617 2026-03-08T23:17:09.134 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 850403524617 2026-03-08T23:17:09.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 850403524617' 2026-03-08T23:17:09.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:09.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 850403524616 -lt 850403524617 2026-03-08T23:17:09.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:17:10.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:17:10.302 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:10.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 850403524618 -lt 
850403524617 2026-03-08T23:17:10.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:10.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-871878361095 2026-03-08T23:17:10.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:10.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:17:10.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-871878361095 2026-03-08T23:17:10.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:10.482 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 871878361095 2026-03-08T23:17:10.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=871878361095 2026-03-08T23:17:10.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 871878361095' 2026-03-08T23:17:10.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:17:10.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 871878361095 -lt 871878361095 2026-03-08T23:17:10.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: 
wait_for_clean: get_num_pgs 2026-03-08T23:17:10.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:17:10.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:17:10.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:17:10.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:17:10.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:17:10.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:17:10.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:17:10.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:17:10.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:17:10.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:17:11.045 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:17:11.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:17:11.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:17:11.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:17:11.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:17:11.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:17:11.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:17:11.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3574: TEST_corrupt_scrub_replicated: timeout 30 rados -p csr_pool get ROBJ17 td/osd-scrub-repair/robj17.out 2026-03-08T23:17:11.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3575: TEST_corrupt_scrub_replicated: timeout 30 rados -p csr_pool get ROBJ18 td/osd-scrub-repair/robj18.out 2026-03-08T23:17:11.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3577: TEST_corrupt_scrub_replicated: diff -q td/osd-scrub-repair/new.ROBJ17 td/osd-scrub-repair/robj17.out 2026-03-08T23:17:11.290 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3578: TEST_corrupt_scrub_replicated: rm -f td/osd-scrub-repair/new.ROBJ17 td/osd-scrub-repair/robj17.out 2026-03-08T23:17:11.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3579: TEST_corrupt_scrub_replicated: diff -q td/osd-scrub-repair/new.ROBJ18 td/osd-scrub-repair/robj18.out 2026-03-08T23:17:11.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3580: TEST_corrupt_scrub_replicated: rm -f td/osd-scrub-repair/new.ROBJ18 td/osd-scrub-repair/robj18.out 2026-03-08T23:17:11.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3582: TEST_corrupt_scrub_replicated: '[' 0 '!=' 0 ']' 2026-03-08T23:17:11.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:3588: TEST_corrupt_scrub_replicated: ceph osd pool rm csr_pool csr_pool --yes-i-really-really-mean-it 2026-03-08T23:17:11.501 INFO:tasks.workunit.client.0.vm03.stderr:pool 'csr_pool' removed 2026-03-08T23:17:11.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:17:11.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:17:11.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:17:11.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:17:11.519 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:17:11.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:17:11.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:17:11.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:17:11.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:17:11.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:17:11.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:17:11.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:17:11.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
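[Editor's note] The `get_num_active_clean` trace above shows `wait_for_clean` counting PGs with the jq filter `.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length`. A minimal Python rendering of that filter, using a hypothetical sample `pg dump` payload (not data captured from this run):

```python
import json

def num_active_clean(pg_dump_json: str) -> int:
    """Count PGs whose state contains 'active' and 'clean' but not
    'stale', mirroring the jq filter traced in the log."""
    stats = json.loads(pg_dump_json)["pg_stats"]
    return sum(
        1
        for pg in stats
        if "active" in pg["state"]
        and "clean" in pg["state"]
        and "stale" not in pg["state"]
    )

# Hypothetical sample; pgids and states are illustrative only.
sample = json.dumps({
    "pg_stats": [
        {"pgid": "1.0", "state": "active+clean"},
        {"pgid": "1.1", "state": "active+clean+scrubbing"},
        {"pgid": "1.2", "state": "stale+active+clean"},
        {"pgid": "1.3", "state": "peering"},
    ]
})
print(num_active_clean(sample))  # → 2
```

`wait_for_clean` loops until this count equals `get_num_pgs` (`.pgmap.num_pgs` from `ceph status`), which is why the trace ends with `test 6 = 6` followed by `break`.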
2026-03-08T23:17:11.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:17:11.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:17:11.641 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:17:11.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:17:11.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:17:11.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:17:11.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:17:11.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:17:11.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:17:11.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:17:11.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:17:11.680 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:11.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:11.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:17:11.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:17:11.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:17:11.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:17:11.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:17:11.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:17:11.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:17:11.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:17:11.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
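[Editor's note] The `teardown` traces around here run a core-dump check: if `kernel.core_pattern` starts with `|` the kernel pipes cores to a handler, otherwise `teardown` lists the pattern's directory for leftover core files. A simplified sketch of that decision (the real helper also greps the pattern for `core`; the `listdir` hook here is an illustration device, not part of ceph-helpers.sh):

```python
import os

def cores_present(core_pattern: str, listdir=os.listdir) -> bool:
    """Rough equivalent of teardown's core check as traced in the log."""
    if core_pattern.startswith("|"):
        # cores go to a pipe handler; nothing to list on disk
        return False
    d = os.path.dirname(core_pattern)
    try:
        return len(listdir(d)) > 0
    except FileNotFoundError:
        return False

# Simulate this run: the coredump directory listed empty, so no cores.
print(cores_present("/home/ubuntu/cephtest/archive/coredump/%t.%p.core",
                    listdir=lambda d: []))  # → False
```

In the trace this resolves to `'[' no = yes -o '' = 1 ']'` failing, so `teardown` proceeds straight to `rm -fr td/osd-scrub-repair`.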
2026-03-08T23:17:11.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:17:11.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:17:11.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:17:11.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:17:11.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:17:11.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:17:11.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:17:11.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:17:11.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:17:11.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:17:11.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:17:11.693 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:11.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:11.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:17:11.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:17:11.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:17:11.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:17:11.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:17:11.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:11.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:11.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:17:11.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
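[Editor's note] `get_asok_dir` and `get_asok_path` recur throughout the trace: the directory falls back to a per-process `/tmp/ceph-asok.<pid>` when `CEPH_ASOK_DIR` is unset, and the path helper returns either a concrete socket path for a named daemon or the literal `$cluster-$name.asok` template that the daemon expands itself. A hedged Python rendering of the bash helpers (the `env`/`pid` parameters are an illustration device):

```python
def get_asok_dir(pid: int, env: dict) -> str:
    # mirrors: [ -n "$CEPH_ASOK_DIR" ] || echo /tmp/ceph-asok.$$
    return env.get("CEPH_ASOK_DIR") or f"/tmp/ceph-asok.{pid}"

def get_asok_path(name: str, pid: int, env: dict) -> str:
    d = get_asok_dir(pid, env)
    if name:  # e.g. "mon.a" → ceph-mon.a.asok
        return f"{d}/ceph-{name}.asok"
    # literal template; the daemon substitutes $cluster and $name
    return d + "/$cluster-$name.asok"

print(get_asok_path("mon.a", 43024, {}))
print(get_asok_path("", 43024, {}))
```

With pid 43024 (as in this run) the named form yields `/tmp/ceph-asok.43024/ceph-mon.a.asok`, matching the `config get` invocations later in the log, while the unnamed form produces the quoted `--admin-socket=` template seen on every daemon command line.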
2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_corrupt_snapset_scrub_rep td/osd-scrub-repair 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5931: TEST_corrupt_snapset_scrub_rep: local dir=td/osd-scrub-repair 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5932: TEST_corrupt_snapset_scrub_rep: local poolname=csr_pool 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5933: TEST_corrupt_snapset_scrub_rep: local total_objs=2 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5935: TEST_corrupt_snapset_scrub_rep: run_mon td/osd-scrub-repair a --osd_pool_default_size=2 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:17:11.697 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:17:11.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair --osd_pool_default_size=2 2026-03-08T23:17:11.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:17:11.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:17:11.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:17:11.723 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:17:11.723 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:11.723 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:11.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:11.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 
--mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2 2026-03-08T23:17:11.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:17:11.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:17:11.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:17:11.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:17:11.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:17:11.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:17:11.757 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:17:11.757 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:17:11.757 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:17:11.761 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:17:11.761 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:11.761 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:11.763 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:17:11.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:17:11.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:17:11.825 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:11.825 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:17:11.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:17:11.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:17:11.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5936: TEST_corrupt_snapset_scrub_rep: run_mgr td/osd-scrub-repair x 2026-03-08T23:17:11.890 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:17:11.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:17:11.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:17:11.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:17:11.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:17:11.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:17:12.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:17:12.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:17:12.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:17:12.000 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:17:12.000 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:12.000 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 
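[Editor's note] The `get_config` round trip traced above runs `ceph --format json daemon <asok> config get <key>` and extracts the value with `jq -r .<key>`. A minimal sketch of the extraction step, using the fsid that appears later in this log as the sample response (the response string is reconstructed, not captured output):

```python
import json

def extract_config(daemon_response: str, key: str) -> str:
    """Equivalent of 'jq -r .<key>' on the config-get JSON."""
    return json.loads(daemon_response)[key]

resp = '{"fsid": "3f205671-6f4e-43b2-8df9-7958536b87d7"}'
print(extract_config(resp, "fsid"))  # → 3f205671-6f4e-43b2-8df9-7958536b87d7
```

Note the trace clears `CEPH_ARGS=` before the `ceph daemon` call so the admin-socket query is not polluted by the test harness's default arguments.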
2026-03-08T23:17:12.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:12.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:17:12.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5937: TEST_corrupt_snapset_scrub_rep: run_osd td/osd-scrub-repair 0 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 
2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:17:12.021 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:17:12.021 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:17:12.024 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:17:12.025 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:17:12.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:17:12.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:17:12.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=565c29a0-3f90-4fa6-afbf-42a75c098dc2 2026-03-08T23:17:12.027 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 565c29a0-3f90-4fa6-afbf-42a75c098dc2 2026-03-08T23:17:12.027 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 565c29a0-3f90-4fa6-afbf-42a75c098dc2' 2026-03-08T23:17:12.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:17:12.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB4A65p8fy+AhAAT3EI5HVa502T3V/5Scy3OQ== 2026-03-08T23:17:12.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB4A65p8fy+AhAAT3EI5HVa502T3V/5Scy3OQ=="}' 2026-03-08T23:17:12.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 565c29a0-3f90-4fa6-afbf-42a75c098dc2 -i td/osd-scrub-repair/0/new.json 2026-03-08T23:17:12.144 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:17:12.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:17:12.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 
'--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB4A65p8fy+AhAAT3EI5HVa502T3V/5Scy3OQ== --osd-uuid 565c29a0-3f90-4fa6-afbf-42a75c098dc2 2026-03-08T23:17:12.179 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:12.183+0000 7fa868db78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:12.181 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:12.187+0000 7fa868db78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:12.183 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:12.187+0000 7fa868db78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:12.183 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:12.187+0000 7fa868db78c0 -1 bdev(0x5637accd2c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:17:12.183 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:12.187+0000 7fa868db78c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:17:14.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:17:14.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:17:14.447 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:17:14.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:17:14.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 
2026-03-08T23:17:14.557 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:17:14.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:17:14.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:17:14.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:17:14.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:17:14.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:17:14.613 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:14.611+0000 7fcf66d0f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:14.626 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:14.631+0000 7fcf66d0f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:17:14.641 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:14.639+0000 7fcf66d0f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:17:14.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:15.090 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:15.095+0000 7fcf66d0f8c0 -1 Falling back to public 
interface 2026-03-08T23:17:15.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:15.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:15.908 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:17:15.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:17:15.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:15.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:17:16.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:16.542 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:16.547+0000 7fcf66d0f8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:17:17.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:17.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:17.084 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:17:17.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:17:17.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:17.085 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:17:17.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:17.659 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:17.663+0000 7fcf624c8640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:17:18.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:18.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:18.278 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:17:18.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:17:18.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:18.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:17:18.452 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/797005012,v1:127.0.0.1:6803/797005012] [v2:127.0.0.1:6804/797005012,v1:127.0.0.1:6805/797005012] exists,up 565c29a0-3f90-4fa6-afbf-42a75c098dc2 2026-03-08T23:17:18.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:17:18.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:17:18.452 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:17:18.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5938: TEST_corrupt_snapset_scrub_rep: run_osd td/osd-scrub-repair 1 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:18.453 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:17:18.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:17:18.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:17:18.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:17:18.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:17:18.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:17:18.454 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:17:18.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:17:18.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:17:18.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:17:18.454 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:17:18.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=f8609885-ec7f-4ad1-9bd4-e7fd14d27ff8 2026-03-08T23:17:18.455 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 f8609885-ec7f-4ad1-9bd4-e7fd14d27ff8 2026-03-08T23:17:18.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 f8609885-ec7f-4ad1-9bd4-e7fd14d27ff8' 2026-03-08T23:17:18.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:17:18.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB+A65pwLQzHBAA2Au/LPFJMxjHRcyFVD1jFA== 2026-03-08T23:17:18.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB+A65pwLQzHBAA2Au/LPFJMxjHRcyFVD1jFA=="}' 2026-03-08T23:17:18.467 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new f8609885-ec7f-4ad1-9bd4-e7fd14d27ff8 -i td/osd-scrub-repair/1/new.json 2026-03-08T23:17:18.631 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:17:18.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:17:18.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB+A65pwLQzHBAA2Au/LPFJMxjHRcyFVD1jFA== --osd-uuid f8609885-ec7f-4ad1-9bd4-e7fd14d27ff8 2026-03-08T23:17:18.662 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:18.667+0000 7fa26a6728c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:18.666 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:18.667+0000 7fa26a6728c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:18.667 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:18.671+0000 7fa26a6728c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:17:18.668 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:18.671+0000 7fa26a6728c0 -1 bdev(0x55e21d2c9c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:17:18.668 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:18.671+0000 7fa26a6728c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:17:20.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:17:20.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:17:20.931 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:17:20.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:17:20.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:17:21.137 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:17:21.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:17:21.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:17:21.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:17:21.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:17:21.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:17:21.156 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:21.159+0000 7f3b1da618c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:21.160 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:21.163+0000 7f3b1da618c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:21.169 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:21.167+0000 7f3b1da618c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:17:21.317 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:21.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:17:21.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:22.362 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:22.367+0000 7f3b1da618c0 -1 Falling back to public interface 2026-03-08T23:17:22.486 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:17:22.486 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:22.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:22.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:17:22.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:22.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:17:22.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:23.348 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:23.351+0000 7f3b1da618c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:17:23.661 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:17:23.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:23.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:23.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:17:23.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:23.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:17:23.861 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:24.481 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:24.483+0000 7f3b1921a640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T23:17:24.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:24.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:24.862 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:17:24.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:17:24.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:24.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:17:25.047 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1079627380,v1:127.0.0.1:6811/1079627380] [v2:127.0.0.1:6812/1079627380,v1:127.0.0.1:6813/1079627380] exists,up f8609885-ec7f-4ad1-9bd4-e7fd14d27ff8 2026-03-08T23:17:25.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:17:25.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:17:25.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:17:25.047 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5939: TEST_corrupt_snapset_scrub_rep: create_rbd_pool 2026-03-08T23:17:25.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:17:25.215 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T23:17:25.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:17:25.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:17:25.485 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:17:25.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:17:26.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:17:26.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5940: TEST_corrupt_snapset_scrub_rep: wait_for_clean 2026-03-08T23:17:26.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:17:26.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:17:26.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:17:26.808 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:17:26.808 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:17:26.808 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:17:26.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:17:26.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:17:26.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:17:26.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:17:26.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:17:26.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:17:26.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:17:26.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:17:26.873 
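The `get_timeout_delays 90 .1` call above expands to the delay schedule `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: a doubling backoff whose steps are capped (here apparently at 15, presumably this run's MAX_TIMEOUT — that cap value is inferred from the output, not stated in the log) and padded with a final remainder so the delays sum exactly to the timeout. A Python re-expression of that schedule (the bash helper does the same arithmetic with `bc`):

```python
def get_timeout_delays(timeout, first_step=1.0, max_step=15.0):
    """Doubling backoff schedule that sums to `timeout`: double the
    step each round, cap it at `max_step`, and append whatever
    remainder is needed so the total equals the timeout exactly."""
    delays, total, step = [], 0.0, first_step
    while total + step <= timeout:
        delays.append(step)
        total += step
        step *= 2
        if max_step and step > max_step:
            step = max_step
    if total < timeout:
        delays.append(timeout - total)  # e.g. the trailing 4.5 above
    return delays
```

With `get_timeout_delays(90, 0.1, 15)` this reproduces the exact sequence logged by the run.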
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:17:27.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:17:27.053 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:17:27.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:17:27.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:27.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:17:27.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483 2026-03-08T23:17:27.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483 2026-03-08T23:17:27.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483' 2026-03-08T23:17:27.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:27.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:17:27.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672962 2026-03-08T23:17:27.223 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672962 2026-03-08T23:17:27.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-42949672962' 2026-03-08T23:17:27.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:27.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483 2026-03-08T23:17:27.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:27.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:17:27.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483 2026-03-08T23:17:27.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:27.227 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483 2026-03-08T23:17:27.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483 2026-03-08T23:17:27.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483' 2026-03-08T23:17:27.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:17:27.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:17:27.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:17:28.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:17:28.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:28.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:17:28.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:17:29.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:17:29.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:29.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836483 2026-03-08T23:17:29.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:29.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672962 2026-03-08T23:17:29.756 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:29.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:17:29.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672962 2026-03-08T23:17:29.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:29.759 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672962 2026-03-08T23:17:29.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672962 2026-03-08T23:17:29.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672962' 2026-03-08T23:17:29.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:17:29.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672962 -lt 42949672962 2026-03-08T23:17:29.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:17:29.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:17:29.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 
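The `flush_pg_stats` trace above has two phases: first `ceph tell osd.N flush_pg_stats` is sent to every OSD (each tell returns a sequence number, e.g. 21474836483), then the helper polls `ceph osd last-stat-seq N` once per second until the monitor's view catches up to that sequence — note the `test 21474836482 -lt 21474836483` retries before the seq lands. A sketch of that flow (hedged: `tell_flush`/`last_stat_seq` are injected stand-ins for the two CLI calls, and the mock cluster below is illustrative):

```python
import time

def flush_pg_stats(osds, tell_flush, last_stat_seq, timeout=300, delay=1.0):
    """Ask each OSD to flush its PG stats, then wait until the
    monitor's last-stat-seq for that OSD reaches the seq the tell
    returned, mirroring the loop in the trace above."""
    targets = {osd: tell_flush(osd) for osd in osds}
    for osd, seq in targets.items():
        remaining = timeout
        while last_stat_seq(osd) < seq:  # e.g. 21474836482 < 21474836483
            time.sleep(delay)
            remaining -= 1
            if remaining == 0:
                raise TimeoutError(f"osd.{osd} never reached seq {seq}")
    return targets

# Mocked cluster using the seqs from the log: the mon lags one query
# behind osd.0's flush seq, and is already current for osd.1.
flush_seq = {0: 21474836483, 1: 42949672962}
mon_seq = {0: 21474836481, 1: 42949672961}

def last_seen(osd):
    mon_seq[osd] = min(mon_seq[osd] + 1, flush_seq[osd])
    return mon_seq[osd]
```

Forcing the flush before reading PG state is what lets `wait_for_clean` trust the subsequent `pg dump` counts.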
2026-03-08T23:17:30.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:17:30.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:17:30.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:17:30.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:17:30.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:17:30.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:17:30.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:17:30.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:17:30.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:17:30.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:17:30.318 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:17:30.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:17:30.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:17:30.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:17:30.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:17:30.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5942: TEST_corrupt_snapset_scrub_rep: create_pool foo 1 2026-03-08T23:17:30.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create foo 1 2026-03-08T23:17:30.784 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' created 2026-03-08T23:17:30.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:17:31.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5943: TEST_corrupt_snapset_scrub_rep: create_pool csr_pool 1 1 2026-03-08T23:17:31.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create csr_pool 1 1 2026-03-08T23:17:32.020 INFO:tasks.workunit.client.0.vm03.stderr:pool 'csr_pool' created 2026-03-08T23:17:32.034 
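`wait_for_clean` above declares success when `get_num_active_clean` equals `get_num_pgs` (`jq .pgmap.num_pgs` over `ceph status`). The active-clean count comes from the jq filter shown in the trace: keep each PG's `state` string if it contains both "active" and "clean" but not "stale", then take the length. The same selection in Python (the sample `pg_stats` below is made up for illustration; jq's `contains` on strings is a substring test):

```python
def num_active_clean(pg_stats):
    """Python equivalent of the jq filter in the trace:
    .pg_stats | [.[] | .state
                 | select(contains("active") and contains("clean"))
                 | select(contains("stale") | not)] | length"""
    return sum(
        1 for pg in pg_stats
        if "active" in pg["state"] and "clean" in pg["state"]
        and "stale" not in pg["state"]
    )

# Hypothetical pg dump: two PGs count as active+clean (scrubbing is
# fine), one is stale, one is still recovering.
sample = [
    {"pgid": "1.0", "state": "active+clean"},
    {"pgid": "2.0", "state": "active+clean+scrubbing"},
    {"pgid": "3.0", "state": "stale+active+clean"},
    {"pgid": "3.1", "state": "active+recovering"},
]
```

In the run above the filter returned 4 of 4 PGs (and later 6 of 6) active+clean, so `wait_for_clean` broke out of its loop on the first iteration.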
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5944: TEST_corrupt_snapset_scrub_rep: wait_for_clean 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:17:33.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:17:33.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:17:33.109 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:17:33.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:17:33.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:17:33.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:17:33.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:17:33.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:17:33.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:17:33.280 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:17:33.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:17:33.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:33.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:17:33.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 
2026-03-08T23:17:33.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T23:17:33.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:17:33.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:33.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:17:33.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672964 2026-03-08T23:17:33.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672964 2026-03-08T23:17:33.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964' 2026-03-08T23:17:33.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:33.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:17:33.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:33.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:17:33.445 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:17:33.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:33.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:17:33.446 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:17:33.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:17:33.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:33.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T23:17:33.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:17:34.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:17:34.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:34.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836485 2026-03-08T23:17:34.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T23:17:34.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672964 2026-03-08T23:17:34.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:34.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:17:34.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672964 2026-03-08T23:17:34.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:34.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672964 2026-03-08T23:17:34.806 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672964 2026-03-08T23:17:34.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672964' 2026-03-08T23:17:34.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:17:34.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672964 -lt 42949672964 2026-03-08T23:17:34.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:17:34.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: 
get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:17:34.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:17:35.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:17:35.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:17:35.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:17:35.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:17:35.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:17:35.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:17:35.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:17:35.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:17:35.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:17:35.342 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:17:35.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:17:35.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:17:35.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:17:35.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:17:35.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:17:35.541 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5946: TEST_corrupt_snapset_scrub_rep: seq 1 2 2026-03-08T23:17:35.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5946: TEST_corrupt_snapset_scrub_rep: for i in $(seq 1 $total_objs) 2026-03-08T23:17:35.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5947: TEST_corrupt_snapset_scrub_rep: objname=ROBJ1 2026-03-08T23:17:35.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5948: TEST_corrupt_snapset_scrub_rep: add_something td/osd-scrub-repair csr_pool ROBJ1 2026-03-08T23:17:35.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 
2026-03-08T23:17:35.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool 2026-03-08T23:17:35.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ1 2026-03-08T23:17:35.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:17:35.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:17:35.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:17:35.760 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:17:35.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:17:35.970 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:17:35.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:17:35.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:17:35.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ1 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:17:36.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5950: 
TEST_corrupt_snapset_scrub_rep: rados --pool csr_pool setomapheader ROBJ1 hdr-ROBJ1 2026-03-08T23:17:36.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5951: TEST_corrupt_snapset_scrub_rep: rados --pool csr_pool setomapval ROBJ1 key-ROBJ1 val-ROBJ1 2026-03-08T23:17:36.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5946: TEST_corrupt_snapset_scrub_rep: for i in $(seq 1 $total_objs) 2026-03-08T23:17:36.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5947: TEST_corrupt_snapset_scrub_rep: objname=ROBJ2 2026-03-08T23:17:36.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5948: TEST_corrupt_snapset_scrub_rep: add_something td/osd-scrub-repair csr_pool ROBJ2 2026-03-08T23:17:36.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:17:36.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=csr_pool 2026-03-08T23:17:36.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=ROBJ2 2026-03-08T23:17:36.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:17:36.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:17:36.057 
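The `add_something` helper traced above seeds one test object per iteration: it sets the `noscrub` and `nodeep-scrub` flags (so scrubbing cannot race with the deliberate corruption that follows), then writes a small ABCDEF payload with `rados put`; the caller adds omap data on top. The command flow can be sketched with an injected executor (`run`, `workdir`, and the argv lists are illustrative stand-ins; this records commands rather than driving a real cluster):

```python
def add_something(run, workdir, pool, obj, payload="ABCDEF"):
    """Sketch of the add_something flow from the trace: pause both
    scrub kinds, then store the payload file as the object's data.
    `run` is a stand-in command executor taking an argv list."""
    run(["ceph", "osd", "set", "noscrub"])
    run(["ceph", "osd", "set", "nodeep-scrub"])
    # the bash helper first echoes the payload into $dir/ORIGINAL
    run(["rados", "--pool", pool, "put", obj, f"{workdir}/ORIGINAL"])

# Record the commands for ROBJ1 as in the log.
recorded = []
add_something(recorded.append, "td/osd-scrub-repair", "csr_pool", "ROBJ1")
```

Setting the flags on every call is idempotent — the cluster just reports "noscrub is set" again, as seen for ROBJ2.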
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:17:36.262 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:17:36.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:17:36.473 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:17:36.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:17:36.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:17:36.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool csr_pool put ROBJ2 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:17:36.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5950: TEST_corrupt_snapset_scrub_rep: rados --pool csr_pool setomapheader ROBJ2 hdr-ROBJ2 2026-03-08T23:17:36.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5951: TEST_corrupt_snapset_scrub_rep: rados --pool csr_pool setomapval ROBJ2 key-ROBJ2 val-ROBJ2 2026-03-08T23:17:36.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5954: TEST_corrupt_snapset_scrub_rep: get_pg csr_pool ROBJ0 2026-03-08T23:17:36.562 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=csr_pool 2026-03-08T23:17:36.562 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=ROBJ0 2026-03-08T23:17:36.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map csr_pool ROBJ0 2026-03-08T23:17:36.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:17:36.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5954: TEST_corrupt_snapset_scrub_rep: local pg=3.0 2026-03-08T23:17:36.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5955: TEST_corrupt_snapset_scrub_rep: get_primary csr_pool ROBJ0 2026-03-08T23:17:36.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=csr_pool 2026-03-08T23:17:36.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=ROBJ0 2026-03-08T23:17:36.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map csr_pool ROBJ0 2026-03-08T23:17:36.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:17:36.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5955: TEST_corrupt_snapset_scrub_rep: local primary=1 2026-03-08T23:17:36.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5957: TEST_corrupt_snapset_scrub_rep: rados -p 
csr_pool mksnap snap1 2026-03-08T23:17:36.997 INFO:tasks.workunit.client.0.vm03.stdout:created pool csr_pool snap snap1 2026-03-08T23:17:37.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5958: TEST_corrupt_snapset_scrub_rep: echo -n head_of_snapshot_data 2026-03-08T23:17:37.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5960: TEST_corrupt_snapset_scrub_rep: seq 1 2 2026-03-08T23:17:37.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5960: TEST_corrupt_snapset_scrub_rep: for i in $(seq 1 $total_objs) 2026-03-08T23:17:37.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5961: TEST_corrupt_snapset_scrub_rep: objname=ROBJ1 2026-03-08T23:17:37.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5964: TEST_corrupt_snapset_scrub_rep: expr 1 % 2 2026-03-08T23:17:37.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5964: TEST_corrupt_snapset_scrub_rep: local osd=1 2026-03-08T23:17:37.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5966: TEST_corrupt_snapset_scrub_rep: case $i in 2026-03-08T23:17:37.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5968: TEST_corrupt_snapset_scrub_rep: rados --pool csr_pool put ROBJ1 td/osd-scrub-repair/change 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5969: TEST_corrupt_snapset_scrub_rep: objectstore_tool td/osd-scrub-repair 1 --head ROBJ1 clear-snapset corrupt 
2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 1 --head ROBJ1 clear-snapset corrupt 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o 
xtrace 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:17:37.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:17:37.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:17:37.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:17:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:17:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 1 --head ROBJ1 clear-snapset corrupt 2026-03-08T23:17:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:17:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:17:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:17:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:17:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 
2026-03-08T23:17:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 --head ROBJ1 clear-snapset corrupt 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 1 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:17:38.528 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:17:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:17:38.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:17:38.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:17:38.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:17:38.529 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:17:38.529 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:38.529 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:38.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 
2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:17:38.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:17:38.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:17:38.531 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:17:38.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:17:38.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:17:38.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:17:38.532 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:17:38.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:17:38.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:17:38.548 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:38.551+0000 7f54bc0148c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:38.548 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:38.551+0000 7f54bc0148c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:38.550 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:38.555+0000 7f54bc0148c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:38.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:17:38.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:39.765 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:39.771+0000 7f54bc0148c0 -1 Falling back to public interface 2026-03-08T23:17:39.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:17:39.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:39.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:17:39.882 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:17:39.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:39.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:17:40.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:40.743 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:40.747+0000 7f54bc0148c0 -1 osd.1 29 log_to_monitors true 2026-03-08T23:17:41.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:41.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:41.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:17:41.054 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:17:41.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:41.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:17:41.239 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:41.765 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:41.771+0000 7f54b2fc4640 -1 osd.1 29 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:17:42.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:42.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:42.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:17:42.241 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:17:42.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:42.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:17:42.416 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 33 up_thru 33 down_at 30 last_clean_interval [10,29) [v2:127.0.0.1:6810/197936534,v1:127.0.0.1:6811/197936534] [v2:127.0.0.1:6812/197936534,v1:127.0.0.1:6813/197936534] exists,up f8609885-ec7f-4ad1-9bd4-e7fd14d27ff8 2026-03-08T23:17:42.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:17:42.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:17:42.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: 
wait_for_osd: return 0 2026-03-08T23:17:42.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:17:42.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:17:42.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:17:42.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:17:42.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:17:42.418 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:17:42.418 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:17:42.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:17:42.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:17:42.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:17:42.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' 
'4.5') 2026-03-08T23:17:42.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:17:42.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:17:42.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:17:42.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:17:42.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:17:42.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:17:42.662 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:17:42.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:17:42.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:42.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:17:42.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836488 2026-03-08T23:17:42.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836488 2026-03-08T23:17:42.749 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488' 2026-03-08T23:17:42.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:42.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:17:42.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920770 2026-03-08T23:17:42.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920770 2026-03-08T23:17:42.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836488 1-141733920770' 2026-03-08T23:17:42.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:42.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836488 2026-03-08T23:17:42.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:42.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:17:42.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836488 2026-03-08T23:17:42.830 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:42.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836488 2026-03-08T23:17:42.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836488' 2026-03-08T23:17:42.831 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836488 2026-03-08T23:17:42.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:43.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836488 2026-03-08T23:17:43.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:17:44.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:17:44.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:44.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836488 2026-03-08T23:17:44.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:17:45.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 
']' 2026-03-08T23:17:45.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:17:45.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836489 -lt 21474836488 2026-03-08T23:17:45.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:45.344 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-141733920770 2026-03-08T23:17:45.344 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:45.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:17:45.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-141733920770 2026-03-08T23:17:45.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:17:45.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920770 2026-03-08T23:17:45.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 141733920770' 2026-03-08T23:17:45.347 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 141733920770 2026-03-08T23:17:45.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:17:45.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920770 -lt 141733920770 2026-03-08T23:17:45.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:17:45.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:17:45.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:17:45.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:17:45.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:17:45.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:17:45.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:17:45.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:17:45.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:17:45.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: 
get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:17:45.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:17:45.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:17:45.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:17:45.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:17:45.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:17:46.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:17:46.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:17:46.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:17:46.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5960: TEST_corrupt_snapset_scrub_rep: for i in $(seq 1 $total_objs) 2026-03-08T23:17:46.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5961: TEST_corrupt_snapset_scrub_rep: objname=ROBJ2 2026-03-08T23:17:46.098 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5964: TEST_corrupt_snapset_scrub_rep: expr 2 % 2 2026-03-08T23:17:46.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5964: TEST_corrupt_snapset_scrub_rep: local osd=0 2026-03-08T23:17:46.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5966: TEST_corrupt_snapset_scrub_rep: case $i in 2026-03-08T23:17:46.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5973: TEST_corrupt_snapset_scrub_rep: rados --pool csr_pool put ROBJ2 td/osd-scrub-repair/change 2026-03-08T23:17:46.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5974: TEST_corrupt_snapset_scrub_rep: objectstore_tool td/osd-scrub-repair 0 --head ROBJ2 clear-snapset corrupt 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 --head ROBJ2 clear-snapset corrupt 2026-03-08T23:17:46.124 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:17:46.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:17:46.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:17:46.231 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 --head ROBJ2 clear-snapset corrupt 2026-03-08T23:17:46.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:17:46.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:17:46.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:17:46.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:17:46.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:17:46.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 --head ROBJ2 clear-snapset corrupt 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:17:47.417 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:17:47.417 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:17:47.417 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:17:47.418 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:17:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:17:47.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:17:47.419 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:17:47.419 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:17:47.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:17:47.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:17:47.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:17:47.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:17:47.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:17:47.436 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:47.439+0000 7fdd353328c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:47.445 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:47.451+0000 7fdd353328c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:17:47.446 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:47.451+0000 7fdd353328c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:47.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:17:47.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:48.633 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:48.639+0000 7fdd353328c0 -1 Falling back to 
public interface 2026-03-08T23:17:48.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:48.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:48.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:17:48.777 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:17:48.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:48.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:17:48.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:49.622 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:17:49.627+0000 7fdd353328c0 -1 osd.0 35 log_to_monitors true 2026-03-08T23:17:49.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:49.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:49.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:17:49.945 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:17:49.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:49.945 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:17:50.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:17:51.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:17:51.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:17:51.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:17:51.129 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:17:51.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:17:51.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:17:51.299 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 39 up_thru 39 down_at 36 last_clean_interval [5,35) [v2:127.0.0.1:6802/3034056259,v1:127.0.0.1:6803/3034056259] [v2:127.0.0.1:6804/3034056259,v1:127.0.0.1:6805/3034056259] exists,up 565c29a0-3f90-4fa6-afbf-42a75c098dc2 2026-03-08T23:17:51.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:17:51.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:17:51.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:17:51.299 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:17:51.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:17:51.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:17:51.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:17:51.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:17:51.300 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:17:51.300 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:17:51.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:17:51.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:17:51.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:17:51.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:17:51.368 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:17:51.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:17:51.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:17:51.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:17:51.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:17:51.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:17:51.535 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:17:51.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:17:51.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:51.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:17:51.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724546 2026-03-08T23:17:51.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724546 2026-03-08T23:17:51.612 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724546' 2026-03-08T23:17:51.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:17:51.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:17:51.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920773 2026-03-08T23:17:51.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920773 2026-03-08T23:17:51.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724546 1-141733920773' 2026-03-08T23:17:51.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:17:51.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-167503724546 2026-03-08T23:17:51.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:17:51.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:17:51.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-167503724546 2026-03-08T23:17:51.691 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:17:51.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724546
2026-03-08T23:17:51.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 167503724546'
2026-03-08T23:17:51.693 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 167503724546
2026-03-08T23:17:51.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:17:51.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 167503724546
2026-03-08T23:17:51.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:17:52.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:17:52.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:17:53.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724546 -lt 167503724546
2026-03-08T23:17:53.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:17:53.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:17:53.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-141733920773
2026-03-08T23:17:53.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:17:53.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-141733920773
2026-03-08T23:17:53.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:17:53.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920773
2026-03-08T23:17:53.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 141733920773'
2026-03-08T23:17:53.029 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 141733920773
2026-03-08T23:17:53.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:17:53.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920773 -lt 141733920773
2026-03-08T23:17:53.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:17:53.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:17:53.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:17:53.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0
2026-03-08T23:17:53.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:17:53.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:17:53.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:17:53.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:17:53.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:17:53.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:17:53.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:17:53.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6
2026-03-08T23:17:53.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:17:53.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:17:53.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:17:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6
2026-03-08T23:17:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:17:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:17:53.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5979: TEST_corrupt_snapset_scrub_rep: rm td/osd-scrub-repair/change
2026-03-08T23:17:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5981: TEST_corrupt_snapset_scrub_rep: pg_scrub 3.0
2026-03-08T23:17:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=3.0
2026-03-08T23:17:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 3.0
2026-03-08T23:17:53.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=3.0
2026-03-08T23:17:53.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3
2026-03-08T23:17:53.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:17:53.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:17:53.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:17:53.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:17:53.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:17:53.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3')
2026-03-08T23:17:53.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays
2026-03-08T23:17:53.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0
2026-03-08T23:17:53.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats
2026-03-08T23:17:53.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:17:53.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:17:54.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:17:54.114 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:17:54.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:17:54.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:17:54.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:17:54.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=167503724547
2026-03-08T23:17:54.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 167503724547
2026-03-08T23:17:54.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724547'
2026-03-08T23:17:54.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:17:54.198 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:17:54.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920774
2026-03-08T23:17:54.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920774
2026-03-08T23:17:54.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-167503724547 1-141733920774'
2026-03-08T23:17:54.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:17:54.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-167503724547
2026-03-08T23:17:54.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:17:54.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:17:54.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-167503724547
2026-03-08T23:17:54.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:17:54.290 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 167503724547
2026-03-08T23:17:54.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=167503724547
2026-03-08T23:17:54.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 167503724547'
2026-03-08T23:17:54.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:17:54.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724546 -lt 167503724547
2026-03-08T23:17:54.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:17:55.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:17:55.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:17:55.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 167503724547 -lt 167503724547
2026-03-08T23:17:55.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:17:55.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-141733920774
2026-03-08T23:17:55.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:17:55.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:17:55.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-141733920774
2026-03-08T23:17:55.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:17:55.632 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 141733920774
2026-03-08T23:17:55.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920774
2026-03-08T23:17:55.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 141733920774'
2026-03-08T23:17:55.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 3.0 loop 0
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920774 -lt 141733920774
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 3.0 loop 0'
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 3.0
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=3.0
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 3.0 query
2026-03-08T23:17:55.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state '
2026-03-08T23:17:55.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean
2026-03-08T23:17:55.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]]
2026-03-08T23:17:55.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break
2026-03-08T23:17:55.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0
2026-03-08T23:17:55.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 3.0
2026-03-08T23:17:55.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:17:55.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:17:55.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:17:55.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:17:56.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local last_scrub=2026-03-08T23:17:32.024084+0000
2026-03-08T23:17:56.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 3.0
2026-03-08T23:17:56.235 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 3.0 on osd.1 to scrub
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 3.0 2026-03-08T23:17:32.024084+0000
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=3.0
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:17:32.024084+0000
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:17:56.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:17:56.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:17:56.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:17:56.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:32.024084+0000 '>' 2026-03-08T23:17:32.024084+0000
2026-03-08T23:17:56.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:17:57.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:17:57.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:17:57.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:17:57.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:17:57.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:17:57.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:17:57.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:17:57.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:32.024084+0000 '>' 2026-03-08T23:17:32.024084+0000
2026-03-08T23:17:57.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:17:58.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:17:58.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:17:58.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:17:58.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:17:58.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:17:58.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:17:58.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:17:58.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:32.024084+0000 '>' 2026-03-08T23:17:32.024084+0000
2026-03-08T23:17:58.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:17:59.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:17:59.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:17:59.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:17:59.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:17:59.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:17:59.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:17:59.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:17:59.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:32.024084+0000 '>' 2026-03-08T23:17:32.024084+0000
2026-03-08T23:17:59.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:18:00.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:18:00.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:18:00.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:18:00.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:18:00.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:18:00.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:18:00.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:18:01.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:32.024084+0000 '>' 2026-03-08T23:17:32.024084+0000
2026-03-08T23:18:01.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:18:02.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:18:02.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:18:02.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:18:02.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:18:02.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:18:02.107 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:18:02.107 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:18:02.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:32.024084+0000 '>' 2026-03-08T23:17:32.024084+0000
2026-03-08T23:18:02.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:18:03.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:18:03.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:18:03.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 3.0 last_scrub_stamp
2026-03-08T23:18:03.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=3.0
2026-03-08T23:18:03.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:18:03.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:18:03.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="3.0") | .last_scrub_stamp'
2026-03-08T23:18:03.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:17:56.712947+0000 '>' 2026-03-08T23:17:32.024084+0000
2026-03-08T23:18:03.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T23:18:03.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5983: TEST_corrupt_snapset_scrub_rep: rados list-inconsistent-pg csr_pool
2026-03-08T23:18:03.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5985: TEST_corrupt_snapset_scrub_rep: jq '. | length' td/osd-scrub-repair/json
2026-03-08T23:18:03.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5985: TEST_corrupt_snapset_scrub_rep: test 1 = 1
2026-03-08T23:18:03.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5987: TEST_corrupt_snapset_scrub_rep: jq -r '.[0]' td/osd-scrub-repair/json
2026-03-08T23:18:03.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5987: TEST_corrupt_snapset_scrub_rep: test 3.0 = 3.0
2026-03-08T23:18:03.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5989: TEST_corrupt_snapset_scrub_rep: rados list-inconsistent-obj 3.0
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5991: TEST_corrupt_snapset_scrub_rep: jq 'def walk(f):
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr: . as $in
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr: else f
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr: end;
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end)
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end)
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end)
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end)
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)'
2026-03-08T23:18:03.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5991: TEST_corrupt_snapset_scrub_rep: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))'
2026-03-08T23:18:03.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5991: TEST_corrupt_snapset_scrub_rep: jq .inconsistents
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6157: TEST_corrupt_snapset_scrub_rep: jq 'def walk(f):
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr: . as $in
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr: | if type == "object" then
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr: reduce keys[] as $key
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr: ( {}; . + { ($key): ($in[$key] | walk(f)) } ) | f
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr: elif type == "array" then map( walk(f) ) | f
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr: else f
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr: end;
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr:walk(if type == "object" then del(.mtime) else . end)
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.local_mtime) else . end)
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.last_reqid) else . end)
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.version) else . end)
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr:| walk(if type == "object" then del(.prior_version) else . end)' td/osd-scrub-repair/json
2026-03-08T23:18:03.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6157: TEST_corrupt_snapset_scrub_rep: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print(json.dumps(ud, sort_keys=True, indent=2))'
2026-03-08T23:18:03.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6157: TEST_corrupt_snapset_scrub_rep: jq .inconsistents
2026-03-08T23:18:03.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6158: TEST_corrupt_snapset_scrub_rep: multidiff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson
2026-03-08T23:18:03.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-repair/checkcsjson td/osd-scrub-repair/csjson
2026-03-08T23:18:03.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6159: TEST_corrupt_snapset_scrub_rep: test no = yes
2026-03-08T23:18:03.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6164: TEST_corrupt_snapset_scrub_rep: test '' = yes
2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6169: TEST_corrupt_snapset_scrub_rep: ERRORS=0
2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6170: TEST_corrupt_snapset_scrub_rep: declare -a err_strings
2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6171: TEST_corrupt_snapset_scrub_rep: err_strings[0]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid [0-9]*:.*:::ROBJ1:head : snapset inconsistent'
2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6172: TEST_corrupt_snapset_scrub_rep: err_strings[1]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid [0-9]*:.*:::ROBJ2:head : snapset inconsistent'
2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6173: TEST_corrupt_snapset_scrub_rep: err_strings[2]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 [0-9]*:.*:::ROBJ1:1 : is an unexpected clone'
2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6174: TEST_corrupt_snapset_scrub_rep: err_strings[3]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub : stat mismatch, got 3/4 objects, 1/2 clones, 3/4 dirty, 3/4 omap, 0/0 pinned, 0/0 hit_set_archive, 0/0 whiteouts, 49/56 bytes, 0/0 manifest objects, 0/0 hit_set_archive bytes.'
2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6175: TEST_corrupt_snapset_scrub_rep: err_strings[4]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 0 missing, 2 inconsistent objects' 2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6176: TEST_corrupt_snapset_scrub_rep: err_strings[5]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 4 errors' 2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6178: TEST_corrupt_snapset_scrub_rep: for err_string in "${err_strings[@]}" 2026-03-08T23:18:03.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6180: TEST_corrupt_snapset_scrub_rep: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid [0-9]*:.*:::ROBJ1:head : snapset inconsistent' td/osd-scrub-repair/osd.1.log 2026-03-08T23:18:03.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6178: TEST_corrupt_snapset_scrub_rep: for err_string in "${err_strings[@]}" 2026-03-08T23:18:03.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6180: TEST_corrupt_snapset_scrub_rep: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid [0-9]*:.*:::ROBJ2:head : snapset inconsistent' td/osd-scrub-repair/osd.1.log 2026-03-08T23:18:03.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6178: TEST_corrupt_snapset_scrub_rep: for err_string in "${err_strings[@]}" 2026-03-08T23:18:03.539 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6180: TEST_corrupt_snapset_scrub_rep: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 [0-9]*:.*:::ROBJ1:1 : is an unexpected clone' td/osd-scrub-repair/osd.1.log 2026-03-08T23:18:03.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6178: TEST_corrupt_snapset_scrub_rep: for err_string in "${err_strings[@]}" 2026-03-08T23:18:03.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6180: TEST_corrupt_snapset_scrub_rep: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub : stat mismatch, got 3/4 objects, 1/2 clones, 3/4 dirty, 3/4 omap, 0/0 pinned, 0/0 hit_set_archive, 0/0 whiteouts, 49/56 bytes, 0/0 manifest objects, 0/0 hit_set_archive bytes.' td/osd-scrub-repair/osd.1.log 2026-03-08T23:18:03.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6178: TEST_corrupt_snapset_scrub_rep: for err_string in "${err_strings[@]}" 2026-03-08T23:18:03.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6180: TEST_corrupt_snapset_scrub_rep: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 0 missing, 2 inconsistent objects' td/osd-scrub-repair/osd.1.log 2026-03-08T23:18:03.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6178: TEST_corrupt_snapset_scrub_rep: for err_string in "${err_strings[@]}" 2026-03-08T23:18:03.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6180: TEST_corrupt_snapset_scrub_rep: grep -q 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 4 errors' 
td/osd-scrub-repair/osd.1.log 2026-03-08T23:18:03.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6187: TEST_corrupt_snapset_scrub_rep: '[' 0 '!=' 0 ']' 2026-03-08T23:18:03.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6193: TEST_corrupt_snapset_scrub_rep: ceph osd pool rm csr_pool csr_pool --yes-i-really-really-mean-it 2026-03-08T23:18:03.748 INFO:tasks.workunit.client.0.vm03.stderr:pool 'csr_pool' removed 2026-03-08T23:18:03.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:18:03.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:18:03.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:18:03.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:18:03.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:18:03.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:18:03.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:18:03.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:18:03.768 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:18:03.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:18:03.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:18:03.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:18:03.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:18:03.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:18:03.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:18:03.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:18:03.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:18:03.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:18:03.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:18:03.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: 
teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:18:03.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:18:03.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:18:03.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:18:03.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:18:03.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:03.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:03.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:18:03.913 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:18:03.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:18:03.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:18:03.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:18:03.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:18:03.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:18:03.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:18:03.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T23:18:03.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:18:03.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:18:03.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:18:03.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:18:03.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:18:03.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:18:03.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:18:03.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:18:03.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:18:03.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:18:03.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:18:03.921 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:18:03.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:18:03.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:03.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:03.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:18:03.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:18:03.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:18:03.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:18:03.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:18:03.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:03.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:03.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 
2026-03-08T23:18:03.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_list_missing_erasure_coded_appends td/osd-scrub-repair 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:978: TEST_list_missing_erasure_coded_appends: list_missing_erasure_coded td/osd-scrub-repair false 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:916: list_missing_erasure_coded: local dir=td/osd-scrub-repair 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:917: list_missing_erasure_coded: local allow_overwrites=false 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:918: list_missing_erasure_coded: local poolname=ecpool 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:920: list_missing_erasure_coded: run_mon td/osd-scrub-repair a 2026-03-08T23:18:03.925 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:18:03.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T23:18:03.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:18:03.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:18:03.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:18:03.951 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:18:03.951 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:03.951 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:03.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:03.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:18:03.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:18:03.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:18:03.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:18:03.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:18:03.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:18:03.981 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:18:03.981 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:18:03.982 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:18:03.982 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:18:03.989 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:18:03.990 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:03.990 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:03.990 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:18:03.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:18:03.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:18:04.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:18:04.060 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:18:04.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:18:04.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:18:04.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:18:04.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:18:04.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:18:04.061 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:18:04.061 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:18:04.061 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:04.061 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:04.062 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:18:04.068 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:18:04.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:18:04.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:921: list_missing_erasure_coded: run_mgr td/osd-scrub-repair x 2026-03-08T23:18:04.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:18:04.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:18:04.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:18:04.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:18:04.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:18:04.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:18:04.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:18:04.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:18:04.245 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:18:04.245 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:18:04.246 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:04.246 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:04.246 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:04.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:18:04.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:18:04.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:922: list_missing_erasure_coded: seq 0 2 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:922: list_missing_erasure_coded: for id in $(seq 0 2) 
2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:923: list_missing_erasure_coded: run_osd td/osd-scrub-repair 0 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:18:04.269 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:18:04.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:18:04.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:18:04.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:18:04.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:18:04.277 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:18:04.277 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:04.277 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:04.277 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:18:04.278 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:18:04.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:18:04.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:18:04.279 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 2e6345a9-9141-4b95-b973-7d574966e91d 2026-03-08T23:18:04.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2e6345a9-9141-4b95-b973-7d574966e91d 2026-03-08T23:18:04.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 2e6345a9-9141-4b95-b973-7d574966e91d' 2026-03-08T23:18:04.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:18:04.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCsA65pEnS6ERAABi86mGKHus5JsKoqC0spfg== 2026-03-08T23:18:04.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCsA65pEnS6ERAABi86mGKHus5JsKoqC0spfg=="}' 2026-03-08T23:18:04.294 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2e6345a9-9141-4b95-b973-7d574966e91d -i td/osd-scrub-repair/0/new.json 2026-03-08T23:18:04.396 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:18:04.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:18:04.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCsA65pEnS6ERAABi86mGKHus5JsKoqC0spfg== --osd-uuid 2e6345a9-9141-4b95-b973-7d574966e91d 2026-03-08T23:18:04.428 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:04.431+0000 7f54be6bf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:04.431 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:04.435+0000 7f54be6bf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:04.433 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:04.435+0000 7f54be6bf8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:18:04.433 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:04.435+0000 7f54be6bf8c0 -1 bdev(0x55979e3aec00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:18:04.433 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:04.439+0000 7f54be6bf8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:18:06.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:18:06.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:18:06.716 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:18:06.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:18:06.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:18:06.822 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:18:06.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:18:06.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:18:06.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:18:06.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 
2026-03-08T23:18:06.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:18:06.868 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:06.867+0000 7f03924358c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:06.881 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:06.887+0000 7f03924358c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:06.896 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:06.895+0000 7f03924358c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:06.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:18:07.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:07.846 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:07.851+0000 7f03924358c0 -1 Falling back to public interface 2026-03-08T23:18:08.145 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:18:08.145 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:08.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:08.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:18:08.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:08.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:18:08.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:08.815 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:08.819+0000 7f03924358c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:18:09.322 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:18:09.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:09.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:09.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:18:09.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:09.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:18:09.518 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:10.519 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:18:10.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:10.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:10.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:18:10.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:10.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:18:10.698 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/4276791263,v1:127.0.0.1:6803/4276791263] [v2:127.0.0.1:6804/4276791263,v1:127.0.0.1:6805/4276791263] exists,up 2e6345a9-9141-4b95-b973-7d574966e91d 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:922: list_missing_erasure_coded: for id in $(seq 0 2) 
2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:923: list_missing_erasure_coded: run_osd td/osd-scrub-repair 1 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:18:10.699 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:18:10.699 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:18:10.700 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:18:10.700 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:10.700 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:10.700 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:18:10.701 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:18:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:18:10.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:18:10.703 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 f37e250f-6f5c-4892-a243-53d3af214a2d 2026-03-08T23:18:10.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=f37e250f-6f5c-4892-a243-53d3af214a2d 2026-03-08T23:18:10.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 f37e250f-6f5c-4892-a243-53d3af214a2d' 2026-03-08T23:18:10.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:18:10.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCyA65pEYAbKxAAD/dD/Y3jCG9IcOrzroDhwQ== 2026-03-08T23:18:10.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCyA65pEYAbKxAAD/dD/Y3jCG9IcOrzroDhwQ=="}' 2026-03-08T23:18:10.720 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new f37e250f-6f5c-4892-a243-53d3af214a2d -i td/osd-scrub-repair/1/new.json 2026-03-08T23:18:10.988 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:18:11.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:18:11.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCyA65pEYAbKxAAD/dD/Y3jCG9IcOrzroDhwQ== --osd-uuid f37e250f-6f5c-4892-a243-53d3af214a2d 2026-03-08T23:18:11.023 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:11.027+0000 7f0e0e85b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:11.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:11.031+0000 7f0e0e85b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:11.026 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:11.031+0000 7f0e0e85b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:18:11.027 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:11.031+0000 7f0e0e85b8c0 -1 bdev(0x56035f767c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:18:11.027 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:11.031+0000 7f0e0e85b8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:18:13.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:18:13.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:18:13.362 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:18:13.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:18:13.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:18:13.579 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:18:13.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:18:13.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:18:13.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:18:13.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:18:13.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:18:13.596 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:13.599+0000 7fb8e75fd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:13.597 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:13.603+0000 7fb8e75fd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:13.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:13.603+0000 7fb8e75fd8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:18:13.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:13.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:18:13.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:14.810 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:14.815+0000 7fb8e75fd8c0 -1 Falling back to public interface 2026-03-08T23:18:14.964 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:18:14.964 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:14.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:14.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:18:14.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:14.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:18:15.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:15.788 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:15.791+0000 7fb8e75fd8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:18:16.141 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:18:16.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:16.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:16.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:18:16.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:16.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:18:16.329 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:17.330 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:18:17.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:17.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:17.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:18:17.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:17.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:18:17.500 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 9 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2485803397,v1:127.0.0.1:6811/2485803397] [v2:127.0.0.1:6812/2485803397,v1:127.0.0.1:6813/2485803397] exists,up f37e250f-6f5c-4892-a243-53d3af214a2d 2026-03-08T23:18:17.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:18:17.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:922: list_missing_erasure_coded: for id in $(seq 0 2) 
2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:923: list_missing_erasure_coded: run_osd td/osd-scrub-repair 2 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:18:17.501 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:17.501 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:17.502 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:18:17.502 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:18:17.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:18:17.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:18:17.504 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 48fb879e-a585-4694-9b02-f4e46f54caeb 2026-03-08T23:18:17.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=48fb879e-a585-4694-9b02-f4e46f54caeb 2026-03-08T23:18:17.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 48fb879e-a585-4694-9b02-f4e46f54caeb' 2026-03-08T23:18:17.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:18:17.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC5A65pryAuHxAAxf0qFIl5ms4Pf7Hrk1zGsA== 2026-03-08T23:18:17.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC5A65pryAuHxAAxf0qFIl5ms4Pf7Hrk1zGsA=="}' 2026-03-08T23:18:17.516 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 48fb879e-a585-4694-9b02-f4e46f54caeb -i td/osd-scrub-repair/2/new.json 2026-03-08T23:18:17.691 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:18:17.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T23:18:17.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQC5A65pryAuHxAAxf0qFIl5ms4Pf7Hrk1zGsA== --osd-uuid 48fb879e-a585-4694-9b02-f4e46f54caeb 2026-03-08T23:18:17.726 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:17.731+0000 7fd1a18b78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:17.728 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:17.731+0000 7fd1a18b78c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:17.729 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:17.735+0000 7fd1a18b78c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:18:17.729 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:17.735+0000 7fd1a18b78c0 -1 bdev(0x55d121465c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:18:17.729 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:17.735+0000 7fd1a18b78c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T23:18:20.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T23:18:20.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:18:20.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:18:20.004 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:18:20.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:18:20.218 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:18:20.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:18:20.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:18:20.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:18:20.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:18:20.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:18:20.233 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:20.239+0000 7f3874a0b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:20.242 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:20.247+0000 7f3874a0b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:20.244 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:20.247+0000 7f3874a0b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:18:20.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:20.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:18:20.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:20.686 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:20.691+0000 7f3874a0b8c0 -1 Falling back to public interface 2026-03-08T23:18:21.574 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:18:21.574 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:21.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:21.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:18:21.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:21.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:18:21.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:21.675+0000 7f3874a0b8c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:18:21.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:22.776 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:18:22.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:22.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:22.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:18:22.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:22.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:18:23.003 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:24.004 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:18:24.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:24.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:24.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:18:24.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:24.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:18:24.172 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3850682943,v1:127.0.0.1:6819/3850682943] [v2:127.0.0.1:6820/3850682943,v1:127.0.0.1:6821/3850682943] exists,up 48fb879e-a585-4694-9b02-f4e46f54caeb 2026-03-08T23:18:24.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:18:24.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:18:24.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:18:24.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:925: list_missing_erasure_coded: create_rbd_pool 
2026-03-08T23:18:24.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:18:24.342 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T23:18:24.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:18:24.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:18:24.578 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:18:24.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:18:25.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:18:25.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:926: list_missing_erasure_coded: wait_for_clean 2026-03-08T23:18:25.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:18:25.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:18:25.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:18:25.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:18:25.893 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:18:25.894 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:18:25.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:18:25.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:18:25.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:18:25.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:18:25.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:18:25.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:18:25.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:18:25.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:18:25.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:18:26.151 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:18:26.151 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:18:26.151 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:18:26.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:18:26.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:26.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:18:26.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T23:18:26.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T23:18:26.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:18:26.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:26.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:18:26.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705667 2026-03-08T23:18:26.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
38654705667 2026-03-08T23:18:26.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-38654705667' 2026-03-08T23:18:26.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:26.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:18:26.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542146 2026-03-08T23:18:26.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542146 2026-03-08T23:18:26.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-38654705667 2-60129542146' 2026-03-08T23:18:26.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:26.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:18:26.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:26.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:18:26.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:18:26.403 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:26.405 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:18:26.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:18:26.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:18:26.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:18:26.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836485 2026-03-08T23:18:26.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:18:27.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:18:27.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:18:27.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485 2026-03-08T23:18:27.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:27.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-38654705667 2026-03-08T23:18:27.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:27.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:18:27.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705667 2026-03-08T23:18:27.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:27.766 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 38654705667 2026-03-08T23:18:27.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705667 2026-03-08T23:18:27.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705667' 2026-03-08T23:18:27.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:18:27.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705667 -lt 38654705667 2026-03-08T23:18:27.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:27.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:27.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-60129542146 2026-03-08T23:18:27.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:18:27.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542146 2026-03-08T23:18:27.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:27.951 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542146 2026-03-08T23:18:27.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542146 2026-03-08T23:18:27.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542146' 2026-03-08T23:18:27.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:18:28.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542146 -lt 60129542146 2026-03-08T23:18:28.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:18:28.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:18:28.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:18:28.340 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:18:28.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:18:28.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:18:28.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:18:28.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:18:28.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:18:28.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:18:28.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:18:28.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:18:28.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:18:28.507 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:18:28.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:928: list_missing_erasure_coded: create_ec_pool ecpool false k=2 m=1 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=false 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T23:18:28.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=1 2026-03-08T23:18:28.938 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T23:18:28.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T23:18:29.234 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T23:18:29.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:18:30.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' false = true ']' 2026-03-08T23:18:30.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T23:18:30.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:18:30.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:18:30.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:18:30.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:18:30.259 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:18:30.259 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: echo true 2026-03-08T23:18:30.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:18:30.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:18:30.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:18:30.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:18:30.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:18:30.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:18:30.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:18:30.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:18:30.327 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:18:30.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:18:30.493 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:18:30.493 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:18:30.493 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:18:30.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:30.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:18:30.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 2026-03-08T23:18:30.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T23:18:30.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T23:18:30.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:30.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:18:30.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705669 2026-03-08T23:18:30.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705669 2026-03-08T23:18:30.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-38654705669' 2026-03-08T23:18:30.666 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:30.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:18:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542148 2026-03-08T23:18:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542148 2026-03-08T23:18:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-38654705669 2-60129542148' 2026-03-08T23:18:30.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:30.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T23:18:30.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:30.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:18:30.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T23:18:30.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:30.750 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836486 
2026-03-08T23:18:30.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486 2026-03-08T23:18:30.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T23:18:30.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:18:30.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836486 2026-03-08T23:18:30.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:18:31.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:18:31.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:18:32.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836486 2026-03-08T23:18:32.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:32.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705669 2026-03-08T23:18:32.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:32.079 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:18:32.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705669 2026-03-08T23:18:32.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:32.080 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 38654705669 2026-03-08T23:18:32.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705669 2026-03-08T23:18:32.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705669' 2026-03-08T23:18:32.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:18:32.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 38654705669 -lt 38654705669 2026-03-08T23:18:32.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:32.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542148 2026-03-08T23:18:32.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:32.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:18:32.262 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542148 2026-03-08T23:18:32.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:32.263 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542148 2026-03-08T23:18:32.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542148 2026-03-08T23:18:32.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542148' 2026-03-08T23:18:32.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:18:32.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542148 -lt 60129542148 2026-03-08T23:18:32.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:18:32.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:18:32.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:18:32.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:18:32.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 
2026-03-08T23:18:32.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:18:32.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:18:32.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:18:32.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:18:32.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:18:32.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:18:32.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:18:32.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:18:32.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:18:32.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:18:32.999 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:931: list_missing_erasure_coded: add_something td/osd-scrub-repair ecpool MOBJ0 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=MOBJ0 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:18:32.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:18:33.181 
INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:18:33.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:18:33.383 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:18:33.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:18:33.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:18:33.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put MOBJ0 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:18:33.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:932: list_missing_erasure_coded: get_osds ecpool MOBJ0 2026-03-08T23:18:33.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T23:18:33.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=MOBJ0 2026-03-08T23:18:33.438 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool MOBJ0 2026-03-08T23:18:33.438 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:18:33.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=2 2026-03-08T23:18:33.604 
INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 2 1 0 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:932: list_missing_erasure_coded: osds0=('2' '1' '0') 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:932: list_missing_erasure_coded: local -a osds0 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:935: list_missing_erasure_coded: add_something td/osd-scrub-repair ecpool MOBJ1 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=MOBJ1 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:18:33.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 
2026-03-08T23:18:33.814 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:18:33.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:18:34.022 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:18:34.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:18:34.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:18:34.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put MOBJ1 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:18:34.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:936: list_missing_erasure_coded: get_osds ecpool MOBJ1
2026-03-08T23:18:34.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool
2026-03-08T23:18:34.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=MOBJ1
2026-03-08T23:18:34.062 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool MOBJ1
2026-03-08T23:18:34.062 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]'
2026-03-08T23:18:34.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=2
2026-03-08T23:18:34.237 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:18:34.237 INFO:tasks.workunit.client.0.vm03.stderr:0'
2026-03-08T23:18:34.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 2 1 0
2026-03-08T23:18:34.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:936: list_missing_erasure_coded: osds1=('2' '1' '0')
2026-03-08T23:18:34.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:936: list_missing_erasure_coded: local -a osds1
2026-03-08T23:18:34.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:939: list_missing_erasure_coded: seq 0 2
2026-03-08T23:18:34.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:939: list_missing_erasure_coded: for id in $(seq 0 2)
2026-03-08T23:18:34.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:940: list_missing_erasure_coded: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T23:18:34.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:18:34.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:18:34.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:18:34.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:18:34.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:18:34.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:18:34.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:939: list_missing_erasure_coded: for id in $(seq 0 2)
2026-03-08T23:18:34.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:940: list_missing_erasure_coded: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:18:34.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:18:34.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:18:34.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:18:34.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:18:34.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:18:34.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:18:34.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:939: list_missing_erasure_coded: for id in $(seq 0 2)
2026-03-08T23:18:34.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:940: list_missing_erasure_coded: kill_daemons td/osd-scrub-repair TERM osd.2
2026-03-08T23:18:34.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:18:34.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:18:34.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:18:34.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:18:34.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:18:34.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:18:34.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:943: list_missing_erasure_coded: id=2
2026-03-08T23:18:34.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:944: list_missing_erasure_coded: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 MOBJ0 remove
2026-03-08T23:18:35.614 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#2:a9030505:::MOBJ0:head#
2026-03-08T23:18:36.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:946: list_missing_erasure_coded: id=1
2026-03-08T23:18:36.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:947: list_missing_erasure_coded: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 MOBJ0 remove
2026-03-08T23:18:36.804 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:a9030505:::MOBJ0:head#
2026-03-08T23:18:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:950: list_missing_erasure_coded: id=1
2026-03-08T23:18:37.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:951: list_missing_erasure_coded: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 MOBJ1 remove
2026-03-08T23:18:37.965 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:8dd16d55:::MOBJ1:head#
2026-03-08T23:18:38.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:953: list_missing_erasure_coded: id=0
2026-03-08T23:18:38.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:954: list_missing_erasure_coded: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 MOBJ1 remove
2026-03-08T23:18:39.149 INFO:tasks.workunit.client.0.vm03.stdout:remove 2#2:8dd16d55:::MOBJ1:head#
2026-03-08T23:18:39.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:957: list_missing_erasure_coded: seq 0 2
2026-03-08T23:18:39.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:957: list_missing_erasure_coded: for id in $(seq 0 2)
2026-03-08T23:18:39.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:958: list_missing_erasure_coded: activate_osd td/osd-scrub-repair 0
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:18:39.686 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:18:39.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:18:39.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:18:39.688 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0
2026-03-08T23:18:39.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:18:39.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:18:39.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:18:39.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:18:39.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:18:39.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:18:39.708 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:39.711+0000 7ff165b0e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:18:39.708 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:39.711+0000 7ff165b0e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:18:39.710 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:39.715+0000 7ff165b0e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:18:39.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:18:39.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:18:39.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:18:39.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:18:39.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:18:39.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:18:39.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:18:39.873 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:18:39.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:18:39.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:18:40.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:18:40.662 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:40.667+0000 7ff165b0e8c0 -1 Falling back to public interface
2026-03-08T23:18:41.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:18:41.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:18:41.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:18:41.050 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:18:41.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:18:41.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:18:41.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:18:41.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:41.643+0000 7ff165b0e8c0 -1 osd.0 31 log_to_monitors true
2026-03-08T23:18:42.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:18:42.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:18:42.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:18:42.221 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:18:42.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:18:42.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:18:42.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:18:43.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:18:43.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:18:43.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:18:43.400 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:18:43.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:18:43.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:18:43.560 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 38 up_thru 38 down_at 32 last_clean_interval [5,31) [v2:127.0.0.1:6802/2410716718,v1:127.0.0.1:6803/2410716718] [v2:127.0.0.1:6804/2410716718,v1:127.0.0.1:6805/2410716718] exists,up 2e6345a9-9141-4b95-b973-7d574966e91d
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:957: list_missing_erasure_coded: for id in $(seq 0 2)
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:958: list_missing_erasure_coded: activate_osd td/osd-scrub-repair 1
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:18:43.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T23:18:43.562 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1
2026-03-08T23:18:43.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:18:43.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T23:18:43.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T23:18:43.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:18:43.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:18:43.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:18:43.578 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:43.583+0000 7f896d93b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:18:43.579 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:43.583+0000 7f896d93b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:18:43.580 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:43.583+0000 7f896d93b8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:18:43.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T23:18:43.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:18:43.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:18:43.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:18:43.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:18:43.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:18:43.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:18:43.749 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:18:43.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:18:43.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:18:43.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:18:44.285 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:44.291+0000 7f896d93b8c0 -1 Falling back to public interface
2026-03-08T23:18:44.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:18:44.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:18:44.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:18:44.920 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:18:44.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:18:44.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:18:45.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:18:45.506 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:45.511+0000 7f896d93b8c0 -1 osd.1 33 log_to_monitors true
2026-03-08T23:18:46.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:18:46.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:18:46.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:18:46.091 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:18:46.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:18:46.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:18:46.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:18:46.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:46.603+0000 7f89648eb640 -1 osd.1 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:18:47.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:18:47.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:18:47.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:18:47.263 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:18:47.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:18:47.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:18:47.441 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 41 up_thru 41 down_at 34 last_clean_interval [9,33) [v2:127.0.0.1:6810/4023029264,v1:127.0.0.1:6811/4023029264] [v2:127.0.0.1:6812/4023029264,v1:127.0.0.1:6813/4023029264] exists,up f37e250f-6f5c-4892-a243-53d3af214a2d
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:957: list_missing_erasure_coded: for id in $(seq 0 2)
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:958: list_missing_erasure_coded: activate_osd td/osd-scrub-repair 2
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:18:47.442
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:18:47.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:18:47.443 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:18:47.443 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:18:47.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:18:47.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:18:47.444 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2 2026-03-08T23:18:47.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:18:47.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T23:18:47.445 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:18:47.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:18:47.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:18:47.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:18:47.460 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:47.463+0000 7f3738ea68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:47.461 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:47.467+0000 7f3738ea68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:18:47.463 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:47.467+0000 7f3738ea68c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:47.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:18:47.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:48.673 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:48.679+0000 7f3738ea68c0 -1 Falling back to public interface 2026-03-08T23:18:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:18:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:18:48.803 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:18:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:48.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:18:48.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:49.896 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:49.899+0000 7f3738ea68c0 -1 osd.2 35 log_to_monitors true 2026-03-08T23:18:49.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:49.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:49.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:18:49.978 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:18:49.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:18:49.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:50.166 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:18:50.888 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:18:50.891+0000 7f372fe56640 -1 osd.2 35 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:18:51.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:18:51.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:18:51.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:18:51.168 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:18:51.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:18:51.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:18:51.336 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 44 up_thru 44 down_at 36 last_clean_interval [14,35) [v2:127.0.0.1:6818/3453671689,v1:127.0.0.1:6819/3453671689] [v2:127.0.0.1:6820/3453671689,v1:127.0.0.1:6821/3453671689] exists,up 48fb879e-a585-4694-9b02-f4e46f54caeb 2026-03-08T23:18:51.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:18:51.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:18:51.336 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:18:51.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:960: list_missing_erasure_coded: create_rbd_pool 2026-03-08T23:18:51.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:18:51.570 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' removed 2026-03-08T23:18:51.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:18:51.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:18:51.783 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:18:51.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:18:52.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:18:53.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:961: list_missing_erasure_coded: wait_for_clean 2026-03-08T23:18:53.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:18:53.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:18:53.093 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:18:53.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:18:53.094 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:18:53.094 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:18:53.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:18:53.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:18:53.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:18:53.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:18:53.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:18:53.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:18:53.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:18:53.157 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:18:53.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:18:53.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:18:53.328 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:18:53.328 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:18:53.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:18:53.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:53.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:18:53.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=163208757252 2026-03-08T23:18:53.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 163208757252 2026-03-08T23:18:53.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-163208757252' 2026-03-08T23:18:53.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:53.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:18:53.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=176093659139 2026-03-08T23:18:53.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 176093659139 2026-03-08T23:18:53.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-163208757252 1-176093659139' 2026-03-08T23:18:53.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:18:53.497 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:18:53.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=188978561026 2026-03-08T23:18:53.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 188978561026 2026-03-08T23:18:53.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-163208757252 1-176093659139 2-188978561026' 2026-03-08T23:18:53.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:53.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-163208757252 2026-03-08T23:18:53.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:53.577 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:18:53.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-163208757252 2026-03-08T23:18:53.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:53.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=163208757252 2026-03-08T23:18:53.579 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 163208757252 2026-03-08T23:18:53.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 163208757252' 2026-03-08T23:18:53.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:18:53.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 163208757251 -lt 163208757252 2026-03-08T23:18:53.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:18:54.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:18:54.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:18:54.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 163208757251 -lt 
163208757252 2026-03-08T23:18:54.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:18:55.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:18:55.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:18:56.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 163208757252 -lt 163208757252 2026-03-08T23:18:56.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:56.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-176093659139 2026-03-08T23:18:56.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:56.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:18:56.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-176093659139 2026-03-08T23:18:56.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:56.102 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 176093659139 2026-03-08T23:18:56.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=176093659139 
2026-03-08T23:18:56.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 176093659139' 2026-03-08T23:18:56.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:18:56.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 176093659139 -lt 176093659139 2026-03-08T23:18:56.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:18:56.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-188978561026 2026-03-08T23:18:56.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:18:56.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:18:56.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-188978561026 2026-03-08T23:18:56.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:18:56.271 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 188978561026 2026-03-08T23:18:56.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=188978561026 2026-03-08T23:18:56.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: 
echo 'waiting osd.2 seq 188978561026' 2026-03-08T23:18:56.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:18:56.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 188978561026 -lt 188978561026 2026-03-08T23:18:56.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:18:56.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:18:56.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:18:56.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:18:56.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:18:56.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:18:56.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:18:56.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:18:56.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: 
expression+='select(contains("stale") | not)' 2026-03-08T23:18:56.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:18:56.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:18:56.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:18:56.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:18:56.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:18:56.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:18:57.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:18:57.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:18:57.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:18:57.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:964: list_missing_erasure_coded: get_pg ecpool MOBJ0 2026-03-08T23:18:57.013 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:18:57.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=MOBJ0 2026-03-08T23:18:57.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool MOBJ0 2026-03-08T23:18:57.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:18:57.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:964: list_missing_erasure_coded: local pg=2.0 2026-03-08T23:18:57.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:968: list_missing_erasure_coded: repair 2.0 2026-03-08T23:18:57.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:18:57.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:18:57.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:18:57.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:18:57.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:18:57.183 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:18:57.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:18:29.237872+0000 2026-03-08T23:18:57.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:18:57.500 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.2 to repair 2026-03-08T23:18:57.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:18:29.237872+0000 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:18:29.237872+0000 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:18:57.514 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:18:57.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:18:57.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:18:29.237872+0000 '>' 2026-03-08T23:18:29.237872+0000 2026-03-08T23:18:57.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:18:58.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:18:58.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:18:58.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:18:58.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:18:58.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:18:58.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:18:58.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:18:58.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:18:29.237872+0000 '>' 2026-03-08T23:18:29.237872+0000 2026-03-08T23:18:58.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:18:59.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:18:59.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:18:59.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:18:59.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:18:59.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:18:59.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:18:59.857 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:19:00.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:18:29.237872+0000 '>' 2026-03-08T23:18:29.237872+0000 2026-03-08T23:19:00.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:19:01.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:19:01.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:19:01.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:19:01.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:19:01.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:19:01.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:19:01.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:19:01.211 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:18:58.001898+0000 '>' 2026-03-08T23:18:29.237872+0000 2026-03-08T23:19:01.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:19:01.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:970: list_missing_erasure_coded: seq 0 120 2026-03-08T23:19:01.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:970: list_missing_erasure_coded: for i in $(seq 0 120) 2026-03-08T23:19:01.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:971: list_missing_erasure_coded: '[' 0 -lt 60 ']' 2026-03-08T23:19:01.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:972: list_missing_erasure_coded: ceph pg 2.0 list_unfound 2026-03-08T23:19:01.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:972: list_missing_erasure_coded: wc -l 2026-03-08T23:19:01.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:972: list_missing_erasure_coded: egrep 'MOBJ0|MOBJ1' 2026-03-08T23:19:01.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:972: list_missing_erasure_coded: matches=2 2026-03-08T23:19:01.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:973: list_missing_erasure_coded: '[' 2 -eq 2 ']' 2026-03-08T23:19:01.294 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:973: list_missing_erasure_coded: break 2026-03-08T23:19:01.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:19:01.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:19:01.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:19:01.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:19:01.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:19:01.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:19:01.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:19:01.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:19:01.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:19:01.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:19:01.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: 
uname 2026-03-08T23:19:01.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:19:01.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:19:01.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:19:01.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:19:01.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:19:01.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:19:01.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:19:01.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:19:01.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:19:01.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:19:01.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:19:01.417 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:19:01.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:19:01.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:01.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:01.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local 
dir=td/osd-scrub-repair 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:19:01.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:19:01.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:19:01.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:19:01.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:19:01.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:19:01.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:19:01.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:19:01.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:19:01.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:19:01.439 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:19:01.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:19:01.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:19:01.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:19:01.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:19:01.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:19:01.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:19:01.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:19:01.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:19:01.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:19:01.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:19:01.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:19:01.444 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:01.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:01.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:19:01.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:19:01.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:19:01.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:19:01.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:19:01.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:01.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:01.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_list_missing_erasure_coded_overwrites td/osd-scrub-repair 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:982: TEST_list_missing_erasure_coded_overwrites: '[' true = true ']' 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:983: TEST_list_missing_erasure_coded_overwrites: list_missing_erasure_coded td/osd-scrub-repair true 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:916: list_missing_erasure_coded: local dir=td/osd-scrub-repair 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:917: list_missing_erasure_coded: local allow_overwrites=true 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:918: list_missing_erasure_coded: local poolname=ecpool 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:920: list_missing_erasure_coded: run_mon td/osd-scrub-repair a 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local 
dir=td/osd-scrub-repair 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:19:01.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T23:19:01.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:19:01.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:19:01.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:19:01.472 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:19:01.473 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:01.473 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:01.473 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:01.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:19:01.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:19:01.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:19:01.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:19:01.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:19:01.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:19:01.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:19:01.505 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:19:01.505 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:19:01.505 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:19:01.505 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:19:01.505 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:01.506 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:01.506 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:19:01.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:19:01.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:19:01.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:19:01.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:19:01.573 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:19:01.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:19:01.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:19:01.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:19:01.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:19:01.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:19:01.574 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:19:01.574 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:01.574 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:01.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:19:01.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:19:01.574 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:19:01.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:921: list_missing_erasure_coded: run_mgr td/osd-scrub-repair x 2026-03-08T23:19:01.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:19:01.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:19:01.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:19:01.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:19:01.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:19:01.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:19:01.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:19:01.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:19:01.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:19:01.750 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:19:01.750 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:01.750 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:01.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:01.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:19:01.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:19:01.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:922: list_missing_erasure_coded: seq 0 2 2026-03-08T23:19:01.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:922: list_missing_erasure_coded: for id in $(seq 0 2) 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:923: list_missing_erasure_coded: 
run_osd td/osd-scrub-repair 0 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:19:01.772 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:19:01.772 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:01.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:01.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:01.777 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:19:01.777 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:19:01.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:19:01.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:19:01.783 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 4b5cc35b-fc49-4897-a0b5-07cc55270c65 2026-03-08T23:19:01.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4b5cc35b-fc49-4897-a0b5-07cc55270c65 2026-03-08T23:19:01.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 4b5cc35b-fc49-4897-a0b5-07cc55270c65' 2026-03-08T23:19:01.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:19:01.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDlA65pRCEJMBAAgxClCUedvR1ocTDDunnM8g== 2026-03-08T23:19:01.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDlA65pRCEJMBAAgxClCUedvR1ocTDDunnM8g=="}' 2026-03-08T23:19:01.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4b5cc35b-fc49-4897-a0b5-07cc55270c65 -i td/osd-scrub-repair/0/new.json 2026-03-08T23:19:01.900 
INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:19:01.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:19:01.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDlA65pRCEJMBAAgxClCUedvR1ocTDDunnM8g== --osd-uuid 4b5cc35b-fc49-4897-a0b5-07cc55270c65 2026-03-08T23:19:01.933 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:01.935+0000 7fc95b9cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:01.938 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:01.943+0000 7fc95b9cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:01.941 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:01.943+0000 7fc95b9cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:19:01.941 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:01.943+0000 7fc95b9cf8c0 -1 bdev(0x560c44a72c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:19:01.941 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:01.943+0000 7fc95b9cf8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:19:04.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:19:04.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:19:04.210 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:19:04.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:19:04.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:19:04.318 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:19:04.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:19:04.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:19:04.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:19:04.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 
2026-03-08T23:19:04.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:19:04.381 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:04.383+0000 7fb845b678c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:04.381 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:04.387+0000 7fb845b678c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:04.384 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:04.387+0000 7fb845b678c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:04.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:19:04.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:05.581 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:05.587+0000 7fb845b678c0 -1 Falling back to public interface 2026-03-08T23:19:05.654 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:19:05.654 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:05.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:05.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:19:05.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:05.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:19:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:06.555 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:06.559+0000 7fb845b678c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:19:06.828 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:19:06.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:06.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:06.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:19:06.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:19:06.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:07.028 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:07.641 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:07.647+0000 7fb841320640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:19:08.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:08.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:08.030 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:19:08.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:19:08.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:08.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3479484316,v1:127.0.0.1:6803/3479484316] [v2:127.0.0.1:6804/3479484316,v1:127.0.0.1:6805/3479484316] exists,up 4b5cc35b-fc49-4897-a0b5-07cc55270c65 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:19:08.204 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:922: list_missing_erasure_coded: for id in $(seq 0 2) 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:923: list_missing_erasure_coded: run_osd td/osd-scrub-repair 1 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:19:08.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: 
run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:08.205 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:19:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:19:08.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:19:08.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:19:08.206 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:19:08.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:19:08.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:19:08.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:19:08.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:19:08.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2e606676-aace-4794-bbab-ed2aee73a8ef 2026-03-08T23:19:08.207 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 2e606676-aace-4794-bbab-ed2aee73a8ef 2026-03-08T23:19:08.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 2e606676-aace-4794-bbab-ed2aee73a8ef' 2026-03-08T23:19:08.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:19:08.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDsA65pITV1DRAAqjJDrtY+XDCWREys7+sh7g== 2026-03-08T23:19:08.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDsA65pITV1DRAAqjJDrtY+XDCWREys7+sh7g=="}' 2026-03-08T23:19:08.218 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2e606676-aace-4794-bbab-ed2aee73a8ef -i td/osd-scrub-repair/1/new.json 2026-03-08T23:19:08.389 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:19:08.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:19:08.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDsA65pITV1DRAAqjJDrtY+XDCWREys7+sh7g== --osd-uuid 2e606676-aace-4794-bbab-ed2aee73a8ef 2026-03-08T23:19:08.422 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:08.427+0000 7fd7cb0758c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:08.424 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:08.431+0000 7fd7cb0758c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:08.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:08.431+0000 7fd7cb0758c0 -1 WARNING: all dangerous and experimental features are enabled. 
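The provisioning sequence traced above (generate a cephx secret, register the OSD with `ceph osd new ... -i new.json`, then format the data directory with `ceph-osd --mkfs`) can be sketched as follows. This is a hypothetical stand-in, not the real `run_osd`: the uuid, secret, and returned id are copied from this log, and no ceph binaries are invoked.

```shell
#!/usr/bin/env bash
set -euo pipefail

uuid="2e606676-aace-4794-bbab-ed2aee73a8ef"           # normally: uuidgen
secret="AQDsA65pITV1DRAAqjJDrtY+XDCWREys7+sh7g=="     # normally: ceph-authtool --gen-print-key
dir=$(mktemp -d)

# Step 1: the one-line JSON handed to `ceph osd new $uuid -i new.json`.
printf '{"cephx_secret": "%s"}\n' "$secret" > "$dir/new.json"

# Step 2: in the real helper the monitor returns the new id; here we stand in
# the id it returned in the log, then remove the JSON as the helper does.
id=1
rm "$dir/new.json"

# Step 3: the mkfs invocation adds these on top of the common ceph_args; we
# only assemble the argv, we do not run ceph-osd.
mkfs_args=(--mkfs --key "$secret" --osd-uuid "$uuid")
echo "would run: ceph-osd -i $id ... ${mkfs_args[*]}"
```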
2026-03-08T23:19:08.426 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:08.431+0000 7fd7cb0758c0 -1 bdev(0x555ac8189c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:19:08.426 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:08.431+0000 7fd7cb0758c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:19:10.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:19:10.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:19:10.687 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:19:10.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:19:10.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:19:10.895 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:19:10.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:19:10.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:19:10.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:19:10.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:19:10.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:19:10.912 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:10.915+0000 7fe6eacc08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:10.912 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:10.915+0000 7fe6eacc08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:10.914 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:10.919+0000 7fe6eacc08c0 -1 WARNING: all dangerous and experimental features are enabled. 
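The `ceph osd dump --format=json | jq '.flags_set[]' | grep -q '"noup"'` pipeline at ceph-helpers.sh:681 decides whether the helper should wait for the OSD at all: with the `noup` osdmap flag set, the OSD cannot come up, so the wait would be pointless. A rough equivalent with a canned osdmap fragment (grep alone stands in for the jq step, which only isolates the flags array; the JSON below is illustrative, not from this run):

```shell
#!/usr/bin/env bash
# "noup" is absent here, matching the run above where wait_for_osd proceeded.
osd_dump_json='{"epoch": 10, "flags_set": ["sortbitwise", "purged_snapdirs"]}'

if echo "$osd_dump_json" | grep -q '"noup"'; then
    noup_set=1
    echo "noup set: skip wait_for_osd"
else
    noup_set=0
    echo "noup not set: wait_for_osd up 1"
fi
```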
2026-03-08T23:19:11.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:19:11.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:19:11.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:19:11.086 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:19:11.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:19:11.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:19:11.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:11.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:19:11.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:11.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:19:11.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:11.869 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:11.875+0000 7fe6eacc08c0 -1 Falling back to public interface 2026-03-08T23:19:12.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:19:12.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:12.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:19:12.262 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:19:12.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:12.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:19:12.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:12.832 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:12.835+0000 7fe6eacc08c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:19:13.430 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:19:13.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:13.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:13.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:19:13.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:13.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:19:13.624 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:13.966 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:13.971+0000 7fe6e6479640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T23:19:14.627 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:19:14.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:14.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:14.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:19:14.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:14.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:19:14.794 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/899380782,v1:127.0.0.1:6811/899380782] [v2:127.0.0.1:6812/899380782,v1:127.0.0.1:6813/899380782] exists,up 2e606676-aace-4794-bbab-ed2aee73a8ef 2026-03-08T23:19:14.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:19:14.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:19:14.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:19:14.794 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:922: list_missing_erasure_coded: for id in $(seq 0 2) 2026-03-08T23:19:14.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:923: list_missing_erasure_coded: run_osd td/osd-scrub-repair 2 2026-03-08T23:19:14.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:19:14.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: 
run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:14.795 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:14.795 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:19:14.796 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:19:14.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:19:14.797 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:19:14.798 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 15d4cb8c-ffe2-4b9f-aa6d-a8ec70e57ba1 2026-03-08T23:19:14.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=15d4cb8c-ffe2-4b9f-aa6d-a8ec70e57ba1 2026-03-08T23:19:14.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 15d4cb8c-ffe2-4b9f-aa6d-a8ec70e57ba1' 2026-03-08T23:19:14.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:19:14.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDyA65pO0nAMBAAo1J5vYMoYX12ZKMSkC7CMw== 2026-03-08T23:19:14.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDyA65pO0nAMBAAo1J5vYMoYX12ZKMSkC7CMw=="}' 2026-03-08T23:19:14.812 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 15d4cb8c-ffe2-4b9f-aa6d-a8ec70e57ba1 -i td/osd-scrub-repair/2/new.json 2026-03-08T23:19:14.981 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:19:14.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T23:19:14.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDyA65pO0nAMBAAo1J5vYMoYX12ZKMSkC7CMw== --osd-uuid 15d4cb8c-ffe2-4b9f-aa6d-a8ec70e57ba1 2026-03-08T23:19:15.013 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:15.019+0000 7f7c8f8c98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:15.015 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:15.019+0000 7f7c8f8c98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:15.016 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:15.023+0000 7f7c8f8c98c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:19:15.017 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:15.023+0000 7f7c8f8c98c0 -1 bdev(0x55a8dc485c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:19:15.017 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:15.023+0000 7f7c8f8c98c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T23:19:17.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T23:19:17.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:19:17.766 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:19:17.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:19:17.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:19:17.974 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:19:17.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:19:17.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:19:17.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:19:17.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:19:17.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:19:18.005 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:18.011+0000 7fcc0157d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:18.016 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:18.023+0000 7fcc0157d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:18.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:18.023+0000 7fcc0157d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:19:18.165 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:18.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:18.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:18.977 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:18.983+0000 7fcc0157d8c0 -1 Falling back to public interface 2026-03-08T23:19:19.343 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:19:19.344 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:19:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:19.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:19.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:20.469 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:20.475+0000 7fcc0157d8c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:19:20.518 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:19:20.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:20.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:20.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:19:20.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:20.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:20.724 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:21.725 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:19:21.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:21.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:21.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:19:21.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:21.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:21.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:22.926 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:19:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:19:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:22.927 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:23.111 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2234661719,v1:127.0.0.1:6819/2234661719] [v2:127.0.0.1:6820/2234661719,v1:127.0.0.1:6821/2234661719] exists,up 15d4cb8c-ffe2-4b9f-aa6d-a8ec70e57ba1 2026-03-08T23:19:23.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:19:23.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:19:23.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:19:23.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:925: list_missing_erasure_coded: create_rbd_pool 2026-03-08T23:19:23.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:19:23.387 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist 2026-03-08T23:19:23.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:19:23.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:19:23.656 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:19:23.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: 
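`create_rbd_pool`, which runs next in the trace, uses a delete-then-create idiom: removing a possibly absent `rbd` pool first (hence the harmless `pool 'rbd' does not exist` message above) makes the create idempotent across reruns. A sketch with the `ceph` CLI replaced by a tiny stub so it runs without a cluster:

```shell
#!/usr/bin/env bash
set -u
declare -A pools=()

# Stub for `ceph osd pool <delete|create> ...`; tracks pools in an assoc array.
ceph_stub() {
    local verb=$1 name=$2 pg_num=${3:-}
    case "$verb" in
        delete)
            if [ -n "${pools[$name]:-}" ]; then
                unset "pools[$name]"; echo "pool '$name' removed"
            else
                echo "pool '$name' does not exist" >&2   # same message as the log
            fi ;;
        create)
            pools[$name]=$pg_num; echo "pool '$name' created" ;;
    esac
}

create_rbd_pool() {
    ceph_stub delete rbd        # tolerated even when the pool is missing
    ceph_stub create rbd 4      # 4 PGs, as in `create_pool rbd 4` above
}

create_rbd_pool
```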
create_pool: sleep 1 2026-03-08T23:19:24.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:19:25.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:926: list_missing_erasure_coded: wait_for_clean 2026-03-08T23:19:25.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:19:25.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:19:25.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:19:25.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:19:25.416 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:19:25.416 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:19:25.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:19:25.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:19:25.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:19:25.479 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:19:25.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:19:25.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:19:25.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:19:25.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:19:25.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:19:25.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:19:25.644 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:19:25.645 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:19:25.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:19:25.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:25.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:19:25.727 
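`get_timeout_delays 90 .1` produced the delays array above: sleeps double from 0.1s, are capped (here at 15s), and the final entry is trimmed so the schedule sums to exactly the 90s budget. A hedged Python sketch of that schedule (the real helper does this arithmetic in bash with `bc`; the cap value is an assumption read off the output):

```python
def get_timeout_delays(timeout, first_step=1.0, max_delay=15.0):
    """Doubling backoff capped at max_delay; the last delay is trimmed
    so all delays sum to exactly `timeout` seconds."""
    delays, total, step = [], 0.0, first_step
    while timeout - total > 1e-9:   # tolerance guards float round-off
        d = min(step, max_delay, timeout - total)
        delays.append(d)
        total += d
        step *= 2
    return delays
```

For `timeout=90, first_step=0.1` this reproduces the logged sequence: 0.1, 0.2, 0.4, 0.8, 1.6, 3.2, 6.4, 12.8, then four 15s plateaus, then a trimmed 4.5.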
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T23:19:25.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T23:19:25.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:19:25.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:25.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:19:25.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672964 2026-03-08T23:19:25.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672964 2026-03-08T23:19:25.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964' 2026-03-08T23:19:25.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:25.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:19:25.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509442 2026-03-08T23:19:25.900 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509442 2026-03-08T23:19:25.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964 2-64424509442' 2026-03-08T23:19:25.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:25.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:19:25.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:25.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:19:25.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:19:25.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:25.903 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:19:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:19:25.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:19:25.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:19:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T23:19:26.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:19:27.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:19:27.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:19:27.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485 2026-03-08T23:19:27.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:27.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672964 2026-03-08T23:19:27.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:27.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:19:27.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672964 2026-03-08T23:19:27.240 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:27.241 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 
seq 42949672964 2026-03-08T23:19:27.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672964 2026-03-08T23:19:27.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672964' 2026-03-08T23:19:27.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:19:27.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672964 -lt 42949672964 2026-03-08T23:19:27.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:27.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509442 2026-03-08T23:19:27.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:27.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:19:27.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509442 2026-03-08T23:19:27.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:27.426 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442 2026-03-08T23:19:27.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=64424509442 2026-03-08T23:19:27.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509442' 2026-03-08T23:19:27.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:19:27.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509442 2026-03-08T23:19:27.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:19:27.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:19:27.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:19:27.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:19:27.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:19:27.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:19:27.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:19:27.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and 
contains("clean")) | ' 2026-03-08T23:19:27.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:19:27.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:19:27.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:19:27.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:19:27.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:19:27.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:19:27.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:19:28.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:19:28.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:19:28.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:19:28.178 
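`get_num_active_clean` counts PGs whose state contains both "active" and "clean" but not "stale", using the jq filter shown above over `ceph --format json pg dump pgs`. An equivalent filter in Python (a sketch over the same JSON shape, not code from the suite):

```python
def num_active_clean(pg_dump):
    """Count PG states containing 'active' and 'clean' but not 'stale',
    mirroring the jq expression:
    .pg_stats | [.[] | .state | select(...)] | length"""
    return sum(
        1
        for pg in pg_dump["pg_stats"]
        if "active" in pg["state"]
        and "clean" in pg["state"]
        and "stale" not in pg["state"]
    )
```

`wait_for_clean` then compares this count against `get_num_pgs` (`.pgmap.num_pgs` from `ceph status`) and breaks once they match, as in the `test 4 = 4` line above.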
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:928: list_missing_erasure_coded: create_ec_pool ecpool true k=2 m=1 2026-03-08T23:19:28.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T23:19:28.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T23:19:28.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=true 2026-03-08T23:19:28.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T23:19:28.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=1 2026-03-08T23:19:28.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T23:19:28.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T23:19:28.720 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T23:19:28.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:19:29.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' true = true ']' 2026-03-08T23:19:29.741 
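`create_ec_pool ecpool true k=2 m=1` builds an erasure-coded pool whose profile splits each object into k=2 data chunks plus m=1 coding chunk, so the loss of any single chunk (one OSD, with `crush-failure-domain=osd`) is recoverable. With m=1 the coding chunk is conceptually an XOR parity; a toy illustration of that idea only (the pool itself uses Ceph's erasure-code plugin, not this code):

```python
def ec_encode(data: bytes):
    """Split data into k=2 halves and append one XOR parity chunk (m=1).
    The second half is zero-padded so all chunks are the same length."""
    half = (len(data) + 1) // 2
    a, b = data[:half], data[half:].ljust(half, b"\0")
    parity = bytes(x ^ y for x, y in zip(a, b))
    return [a, b, parity]

def ec_recover(chunks):
    """Rebuild the single missing chunk (None) by XORing the survivors:
    since parity = a ^ b, any one of a, b, parity follows from the other two."""
    missing = chunks.index(None)
    others = [c for c in chunks if c is not None]
    rebuilt = bytes(x ^ y for x, y in zip(*others))
    out = list(chunks)
    out[missing] = rebuilt
    return out
```

The test payload written a few steps later is the same `ABCDEF` string the workunit puts into `MOBJ0`.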
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2508: create_ec_pool: ceph osd pool set ecpool allow_ec_overwrites true 2026-03-08T23:19:29.947 INFO:tasks.workunit.client.0.vm03.stderr:set pool 2 allow_ec_overwrites to true 2026-03-08T23:19:29.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T23:19:29.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:19:29.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:19:29.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:19:29.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:19:29.964 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:19:29.964 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:19:29.964 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:19:29.964 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:19:29.964 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:19:30.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:19:30.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:19:30.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:19:30.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:19:30.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:19:30.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:19:30.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:19:30.198 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:19:30.198 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:19:30.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:19:30.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:30.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:19:30.282 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836487 2026-03-08T23:19:30.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836487 2026-03-08T23:19:30.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487' 2026-03-08T23:19:30.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:30.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:19:30.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672966 2026-03-08T23:19:30.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672966 2026-03-08T23:19:30.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-42949672966' 2026-03-08T23:19:30.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:30.365 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:19:30.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509444 2026-03-08T23:19:30.447 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509444 2026-03-08T23:19:30.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-42949672966 2-64424509444' 2026-03-08T23:19:30.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:30.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836487 2026-03-08T23:19:30.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:30.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:19:30.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836487 2026-03-08T23:19:30.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:30.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836487 2026-03-08T23:19:30.452 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836487 2026-03-08T23:19:30.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836487' 2026-03-08T23:19:30.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:19:30.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836487 2026-03-08T23:19:30.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:30.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672966 2026-03-08T23:19:30.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:30.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:19:30.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672966 2026-03-08T23:19:30.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:30.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672966 2026-03-08T23:19:30.628 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672966 2026-03-08T23:19:30.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672966' 2026-03-08T23:19:30.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:19:30.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 42949672966 -lt 42949672966 2026-03-08T23:19:30.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:30.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509444 2026-03-08T23:19:30.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:30.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:19:30.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509444 2026-03-08T23:19:30.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:30.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509444 2026-03-08T23:19:30.802 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509444 2026-03-08T23:19:30.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509444' 2026-03-08T23:19:30.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:19:30.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509443 -lt 64424509444 2026-03-08T23:19:30.967 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:19:31.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:19:31.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:19:32.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509443 -lt 64424509444 2026-03-08T23:19:32.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:19:33.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:19:33.131 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:19:33.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509444 -lt 64424509444 2026-03-08T23:19:33.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:19:33.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:19:33.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:19:33.493 
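Here `flush_pg_stats` tells each OSD to flush, records the returned sequence as an `osd-seq` pair, then loops on `ceph osd last-stat-seq <id>` until the reported value catches up (osd.2 briefly lags: 64424509443 < 64424509444, so the helper sleeps and retries). The sequence numbers in this trace appear to pack two 32-bit halves into a 64-bit value: the high half matches the OSD's `up_from` epoch from the earlier `osd dump` (15 for osd.2), and the low half increments per flush. A decode sketch of that observation (an inference from the logged values, not a documented format guarantee):

```python
def split_stat_seq(seq: int):
    """Split a 64-bit last-stat-seq into (high, low) 32-bit halves.
    In this trace the high half matches the OSD's up_from epoch and
    the low half is a per-boot counter."""
    return seq >> 32, seq & 0xFFFFFFFF
```

The `test $cur -lt $seq` comparison in the helper works directly on the combined 64-bit integer, so it never needs to decode the halves itself.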
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:19:33.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:19:33.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:19:33.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:19:33.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:19:33.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:19:33.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:19:33.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:19:33.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:19:33.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:19:33.665 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:19:33.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:931: list_missing_erasure_coded: add_something td/osd-scrub-repair ecpool MOBJ0 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=MOBJ0 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:19:33.864 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:19:33.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:19:34.033 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:19:34.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:19:34.235 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:19:34.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:19:34.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:19:34.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put MOBJ0 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:19:34.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:932: list_missing_erasure_coded: get_osds ecpool MOBJ0 2026-03-08T23:19:34.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T23:19:34.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=MOBJ0 2026-03-08T23:19:34.275 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool 
MOBJ0 2026-03-08T23:19:34.275 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:19:34.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=2 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 2 1 0 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:932: list_missing_erasure_coded: osds0=('2' '1' '0') 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:932: list_missing_erasure_coded: local -a osds0 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:935: list_missing_erasure_coded: add_something td/osd-scrub-repair ecpool MOBJ1 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=MOBJ1 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:19:34.443 
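The `osds0=('2' '1' '0')` line in the trace shows how the test captures the acting set: `get_osds` prints one OSD id per line (via `jq '.acting | .[]'` over `ceph osd map`), and the caller relies on word splitting to turn that into a bash array. A sketch with a stubbed `get_osds` (the stub and its output are taken from the trace; `get_osds_stub` is a hypothetical name):

```shell
# Sketch: capturing the acting set into a bash array, as
# list_missing_erasure_coded does with get_osds output.
get_osds_stub() {               # stands in for: ceph osd map <pool> <obj> | jq '.acting | .[]'
    printf '2\n1\n0\n'
}
osds0=($(get_osds_stub))        # default IFS splits the lines into array elements
echo "${osds0[0]} ${osds0[1]} ${osds0[2]}"
# prints: 2 1 0
```

The unquoted `$(...)` expansion is deliberate here: quoting it would collapse the three lines into a single array element.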
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:19:34.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:19:34.657 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:19:34.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:19:34.863 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:19:34.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:19:34.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:19:34.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put MOBJ1 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:19:34.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:936: list_missing_erasure_coded: get_osds ecpool MOBJ1 2026-03-08T23:19:34.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T23:19:34.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=MOBJ1 2026-03-08T23:19:34.907 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool 
MOBJ1 2026-03-08T23:19:34.907 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:19:35.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=2 2026-03-08T23:19:35.081 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:19:35.081 INFO:tasks.workunit.client.0.vm03.stderr:0' 2026-03-08T23:19:35.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 2 1 0 2026-03-08T23:19:35.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:936: list_missing_erasure_coded: osds1=('2' '1' '0') 2026-03-08T23:19:35.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:936: list_missing_erasure_coded: local -a osds1 2026-03-08T23:19:35.082 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:939: list_missing_erasure_coded: seq 0 2 2026-03-08T23:19:35.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:939: list_missing_erasure_coded: for id in $(seq 0 2) 2026-03-08T23:19:35.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:940: list_missing_erasure_coded: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:19:35.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:19:35.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:19:35.083 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:19:35.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:19:35.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:19:35.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:19:35.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:939: list_missing_erasure_coded: for id in $(seq 0 2) 2026-03-08T23:19:35.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:940: list_missing_erasure_coded: kill_daemons td/osd-scrub-repair TERM osd.1 2026-03-08T23:19:35.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:19:35.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:19:35.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:19:35.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:19:35.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:19:35.297 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:19:35.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:939: list_missing_erasure_coded: for id in $(seq 0 2) 2026-03-08T23:19:35.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:940: list_missing_erasure_coded: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T23:19:35.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:19:35.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:19:35.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:19:35.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:19:35.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:19:35.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:19:35.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:943: list_missing_erasure_coded: id=2 2026-03-08T23:19:35.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:944: list_missing_erasure_coded: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 MOBJ0 remove 2026-03-08T23:19:36.087 
INFO:tasks.workunit.client.0.vm03.stdout:remove 0#2:a9030505:::MOBJ0:head# 2026-03-08T23:19:36.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:946: list_missing_erasure_coded: id=1 2026-03-08T23:19:36.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:947: list_missing_erasure_coded: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 MOBJ0 remove 2026-03-08T23:19:37.318 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:a9030505:::MOBJ0:head# 2026-03-08T23:19:37.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:950: list_missing_erasure_coded: id=1 2026-03-08T23:19:37.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:951: list_missing_erasure_coded: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 MOBJ1 remove 2026-03-08T23:19:38.485 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#2:8dd16d55:::MOBJ1:head# 2026-03-08T23:19:39.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:953: list_missing_erasure_coded: id=0 2026-03-08T23:19:39.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:954: list_missing_erasure_coded: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 MOBJ1 remove 2026-03-08T23:19:39.678 INFO:tasks.workunit.client.0.vm03.stdout:remove 2#2:8dd16d55:::MOBJ1:head# 2026-03-08T23:19:40.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:957: list_missing_erasure_coded: seq 0 2 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:957: 
list_missing_erasure_coded: for id in $(seq 0 2) 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:958: list_missing_erasure_coded: activate_osd td/osd-scrub-repair 0 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:19:40.213 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:40.213 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:40.213 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:19:40.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:19:40.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:19:40.215 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:19:40.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:19:40.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:19:40.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:19:40.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:19:40.217 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:19:40.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:19:40.232 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:40.235+0000 7effc44b68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:40.232 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:40.235+0000 7effc44b68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:40.233 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:40.239+0000 7effc44b68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:40.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:19:40.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:41.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:41.431+0000 7effc44b68c0 -1 Falling back to public interface 2026-03-08T23:19:41.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:41.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:41.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:19:41.564 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:19:41.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:41.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:19:41.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:42.396 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:42.399+0000 7effc44b68c0 -1 osd.0 33 log_to_monitors true 2026-03-08T23:19:42.724 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:42.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:42.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:19:42.724 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:19:42.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:42.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:19:42.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:43.355 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:43.359+0000 7effbb466640 -1 osd.0 33 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:19:43.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:43.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:43.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:19:43.905 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:19:43.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:43.905 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 38 up_thru 38 down_at 34 last_clean_interval [5,33) [v2:127.0.0.1:6802/940901740,v1:127.0.0.1:6803/940901740] [v2:127.0.0.1:6804/940901740,v1:127.0.0.1:6805/940901740] exists,up 4b5cc35b-fc49-4897-a0b5-07cc55270c65 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:957: list_missing_erasure_coded: for id in $(seq 0 2) 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:958: list_missing_erasure_coded: activate_osd td/osd-scrub-repair 1 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:19:44.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 
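The `wait_for_osd` calls traced above (ceph-helpers.sh:978-991) poll `ceph osd dump | grep "osd.$id up"` up to 300 times, sleeping 1s between attempts, and return 0 as soon as the grep succeeds — here osd.0 came up on the fourth check. A minimal sketch of that retry pattern, using a flaky stub predicate in place of the real `ceph osd dump` pipeline (`wait_for_osd_sketch`, `flaky_up`, and `attempts` are hypothetical names):

```shell
# Sketch of the wait_for_osd polling loop: retry a predicate up to
# N times, breaking out with status 0 on the first success.
wait_for_osd_sketch() {
    local tries=$1; shift
    local i status=1
    for ((i = 0; i < tries; i++)); do
        if "$@"; then
            status=0
            break
        fi
        sleep 0.01              # the real helper sleeps 1s per attempt
    done
    return $status
}
attempts=0
flaky_up() {                    # succeeds on the 3rd call, like a slow-booting OSD
    attempts=$((attempts + 1))
    [ "$attempts" -ge 3 ]
}
wait_for_osd_sketch 10 flaky_up && echo "osd came up after $attempts checks"
# prints: osd came up after 3 checks
```

If the predicate never succeeds within `tries` attempts the function returns 1, which in the real helper makes the test fail rather than hang forever.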
2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 
2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:44.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:19:44.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:19:44.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:19:44.072 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:19:44.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:19:44.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:19:44.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:19:44.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:19:44.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:19:44.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:19:44.088 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:44.091+0000 7f285df308c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:44.089 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:44.095+0000 7f285df308c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:44.093 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:44.099+0000 7f285df308c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:19:44.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:44.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:19:44.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:45.057 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:45.063+0000 7f285df308c0 -1 Falling back to public interface 2026-03-08T23:19:45.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:19:45.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:45.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:19:45.412 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:19:45.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:45.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:19:45.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:46.016 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:46.019+0000 7f285df308c0 -1 osd.1 34 log_to_monitors true 2026-03-08T23:19:46.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:46.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:46.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:19:46.583 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:19:46.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:46.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:19:46.752 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:47.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:47.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:47.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:19:47.753 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:19:47.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:47.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:19:47.921 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 41 up_thru 41 down_at 35 last_clean_interval [10,34) [v2:127.0.0.1:6810/3168316076,v1:127.0.0.1:6811/3168316076] [v2:127.0.0.1:6812/3168316076,v1:127.0.0.1:6813/3168316076] exists,up 2e606676-aace-4794-bbab-ed2aee73a8ef 2026-03-08T23:19:47.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:19:47.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:19:47.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:957: list_missing_erasure_coded: for id in $(seq 
0 2) 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:958: list_missing_erasure_coded: activate_osd td/osd-scrub-repair 2 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:19:47.922 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:19:47.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:19:47.923 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 
2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:19:47.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:19:47.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:19:47.924 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2 2026-03-08T23:19:47.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:19:47.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T23:19:47.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:19:47.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:19:47.927 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:19:47.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:19:47.940 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:47.943+0000 7fb296d3a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:47.940 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:47.947+0000 7fb296d3a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:47.942 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:47.947+0000 7fb296d3a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:48.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:48.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:49.153 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:49.159+0000 7fb296d3a8c0 -1 Falling back to public interface 2026-03-08T23:19:49.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:49.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:49.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:19:49.268 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:19:49.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:49.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:49.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:50.115 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:19:50.119+0000 7fb296d3a8c0 -1 osd.2 35 log_to_monitors true 2026-03-08T23:19:50.427 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:50.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:50.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:19:50.427 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:19:50.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:50.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:50.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:19:51.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:19:51.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:19:51.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:19:51.604 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:19:51.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:19:51.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:19:51.768 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 45 
up_thru 45 down_at 36 last_clean_interval [15,35) [v2:127.0.0.1:6818/3044380166,v1:127.0.0.1:6819/3044380166] [v2:127.0.0.1:6820/3044380166,v1:127.0.0.1:6821/3044380166] exists,up 15d4cb8c-ffe2-4b9f-aa6d-a8ec70e57ba1 2026-03-08T23:19:51.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:19:51.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:19:51.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:19:51.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:960: list_missing_erasure_coded: create_rbd_pool 2026-03-08T23:19:51.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it 2026-03-08T23:19:51.983 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' removed 2026-03-08T23:19:52.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4 2026-03-08T23:19:52.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4 2026-03-08T23:19:52.199 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created 2026-03-08T23:19:52.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:19:53.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd 2026-03-08T23:19:53.532 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:961: list_missing_erasure_coded: wait_for_clean 2026-03-08T23:19:53.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:19:53.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:19:53.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:19:53.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:19:53.533 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:19:53.533 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:19:53.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:19:53.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:19:53.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:19:53.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:19:53.599 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:19:53.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:19:53.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:19:53.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:19:53.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:19:53.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:19:53.764 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:19:53.764 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:19:53.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:19:53.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:53.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:19:53.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=163208757252 2026-03-08T23:19:53.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 163208757252 
2026-03-08T23:19:53.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-163208757252' 2026-03-08T23:19:53.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:53.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:19:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=176093659139 2026-03-08T23:19:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 176093659139 2026-03-08T23:19:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-163208757252 1-176093659139' 2026-03-08T23:19:53.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:19:53.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:19:53.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528322 2026-03-08T23:19:53.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528322 2026-03-08T23:19:53.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-163208757252 1-176093659139 2-193273528322' 
2026-03-08T23:19:53.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:53.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-163208757252 2026-03-08T23:19:53.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:53.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:19:53.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-163208757252 2026-03-08T23:19:53.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:54.000 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 163208757252 2026-03-08T23:19:54.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=163208757252 2026-03-08T23:19:54.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 163208757252' 2026-03-08T23:19:54.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:19:54.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 163208757250 -lt 163208757252 2026-03-08T23:19:54.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T23:19:55.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:19:55.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:19:55.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 163208757252 -lt 163208757252 2026-03-08T23:19:55.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:55.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-176093659139 2026-03-08T23:19:55.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:55.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:19:55.344 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-176093659139 2026-03-08T23:19:55.344 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:55.345 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 176093659139 2026-03-08T23:19:55.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=176093659139 2026-03-08T23:19:55.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.1 seq 176093659139' 2026-03-08T23:19:55.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:19:55.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 176093659139 -lt 176093659139 2026-03-08T23:19:55.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:19:55.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-193273528322 2026-03-08T23:19:55.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:19:55.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:19:55.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-193273528322 2026-03-08T23:19:55.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:19:55.519 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 193273528322 2026-03-08T23:19:55.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528322 2026-03-08T23:19:55.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 193273528322' 2026-03-08T23:19:55.519 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:19:55.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528322 -lt 193273528322 2026-03-08T23:19:55.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:19:55.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:19:55.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:19:55.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:19:55.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:19:55.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:19:55.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:19:55.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:19:55.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:19:55.898 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:19:55.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:19:56.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:19:56.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:19:56.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:19:56.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:19:56.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:19:56.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:19:56.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:19:56.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:964: list_missing_erasure_coded: get_pg ecpool MOBJ0 2026-03-08T23:19:56.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 
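The `get_num_active_clean` jq expression traced above — `.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length` — counts PGs whose state string contains both "active" and "clean" but not "stale". A direct Python equivalent of that filter (the sample input in the test is hypothetical, not from this run):

```python
def get_num_active_clean(pg_dump):
    """Count PG states containing "active" and "clean" but not "stale",
    matching the jq filter used by get_num_active_clean in ceph-helpers.sh.
    pg_dump is the parsed JSON of `ceph --format json pg dump pgs`."""
    return sum(
        1
        for pg in pg_dump["pg_stats"]
        if "active" in pg["state"]
        and "clean" in pg["state"]
        and "stale" not in pg["state"]
    )
```

`wait_for_clean` then compares this count against `.pgmap.num_pgs` from `ceph status`; in the trace it sees 5 == 5 and breaks out immediately.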
2026-03-08T23:19:56.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=MOBJ0 2026-03-08T23:19:56.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool MOBJ0 2026-03-08T23:19:56.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:19:56.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:964: list_missing_erasure_coded: local pg=2.0 2026-03-08T23:19:56.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:968: list_missing_erasure_coded: repair 2.0 2026-03-08T23:19:56.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=2.0 2026-03-08T23:19:56.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 2.0 2026-03-08T23:19:56.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:19:56.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:19:56.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:19:56.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | 
.last_scrub_stamp' 2026-03-08T23:19:56.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:19:28.724584+0000 2026-03-08T23:19:56.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 2.0 2026-03-08T23:19:56.767 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0s0 on osd.2 to repair 2026-03-08T23:19:56.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 2.0 2026-03-08T23:19:28.724584+0000 2026-03-08T23:19:56.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:19:56.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:19:28.724584+0000 2026-03-08T23:19:56.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:19:56.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:19:56.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:19:56.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:19:56.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:19:56.780 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:19:56.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:19:56.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:19:56.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:19:28.724584+0000 '>' 2026-03-08T23:19:28.724584+0000 2026-03-08T23:19:56.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:19:57.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:19:57.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:19:57.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:19:57.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:19:57.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:19:57.947 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:19:57.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:19:58.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:19:28.724584+0000 '>' 2026-03-08T23:19:28.724584+0000 2026-03-08T23:19:58.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:19:59.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:19:59.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:19:59.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:19:59.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:19:59.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:19:59.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:19:59.112 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:19:59.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:19:28.724584+0000 '>' 2026-03-08T23:19:28.724584+0000 2026-03-08T23:19:59.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:20:00.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:20:00.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:00.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:20:00.281 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:00.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:20:00.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:00.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:20:00.461 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:19:57.204109+0000 '>' 2026-03-08T23:19:28.724584+0000 2026-03-08T23:20:00.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:20:00.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:970: list_missing_erasure_coded: seq 0 120 2026-03-08T23:20:00.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:970: list_missing_erasure_coded: for i in $(seq 0 120) 2026-03-08T23:20:00.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:971: list_missing_erasure_coded: '[' 0 -lt 60 ']' 2026-03-08T23:20:00.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:972: list_missing_erasure_coded: ceph pg 2.0 list_unfound 2026-03-08T23:20:00.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:972: list_missing_erasure_coded: wc -l 2026-03-08T23:20:00.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:972: list_missing_erasure_coded: egrep 'MOBJ0|MOBJ1' 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:972: list_missing_erasure_coded: matches=2 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:973: list_missing_erasure_coded: '[' 2 -eq 2 ']' 2026-03-08T23:20:00.552 
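The `repair`/`wait_for_scrub` sequence traced above records the PG's `last_scrub_stamp`, issues `ceph pg repair`, then polls until the stamp is strictly newer than the recorded one. Because the stamps are ISO-8601 strings, plain lexicographic comparison orders them correctly, which is why the shell helper can use `test "$stamp" '>' "$last_scrub"`. A small Python sketch of the polling loop, with the stamp lookup injected (a hypothetical stand-in for `get_last_scrub_stamp`):

```python
import time

def wait_for_scrub(get_stamp, last_scrub, attempts=300, delay=1.0):
    """Sketch of wait_for_scrub in ceph-helpers.sh: poll the PG's
    last_scrub_stamp until it is strictly newer than the recorded stamp.
    ISO-8601 timestamps compare correctly as plain strings."""
    for _ in range(attempts):
        if get_stamp() > last_scrub:  # lexicographic == chronological here
            return True
        time.sleep(delay)  # the shell helper retries once per second
    return False
```

In the trace, the stamp stays at 23:19:28.724584 for three polls and then jumps to 23:19:57.204109, at which point the string comparison succeeds and `wait_for_scrub` returns 0.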
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:973: list_missing_erasure_coded: break 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:20:00.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:20:00.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:20:00.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:20:00.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: 
uname 2026-03-08T23:20:00.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:20:00.673 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:20:00.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:20:00.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:20:00.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:20:00.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:20:00.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:20:00.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:20:00.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:20:00.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:20:00.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:20:00.678 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:20:00.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:20:00.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:20:00.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:20:00.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local 
dir=td/osd-scrub-repair 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:20:00.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:20:00.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:20:00.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:20:00.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:20:00.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:20:00.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:20:00.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:20:00.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:20:00.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:20:00.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:20:00.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:20:00.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:20:00.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:20:00.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:20:00.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:20:00.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:20:00.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:20:00.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:20:00.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:20:00.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:20:00.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:20:00.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:20:00.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:20:00.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:20:00.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:20:00.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:20:00.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair
2026-03-08T23:20:00.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:20:00.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:20:00.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:20:00.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024
2026-03-08T23:20:00.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_periodic_scrub_replicated td/osd-scrub-repair
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5759: TEST_periodic_scrub_replicated: local dir=td/osd-scrub-repair
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5760: TEST_periodic_scrub_replicated: local poolname=psr_pool
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5761: TEST_periodic_scrub_replicated: local objname=POBJ
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5763: TEST_periodic_scrub_replicated: run_mon td/osd-scrub-repair a --osd_pool_default_size=2
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a
2026-03-08T23:20:00.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair --osd_pool_default_size=2
2026-03-08T23:20:00.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:20:00.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:20:00.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:20:00.739 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:20:00.739 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:20:00.739 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:20:00.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:20:00.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2
2026-03-08T23:20:00.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:20:00.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:20:00.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:20:00.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:20:00.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:20:00.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:20:00.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:20:00.773 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:20:00.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:20:00.773 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:20:00.773 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:20:00.773 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:20:00.774 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:20:00.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:20:00.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:20:00.852 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:20:00.853 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:20:00.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:20:00.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T23:20:00.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5764: TEST_periodic_scrub_replicated: run_mgr td/osd-scrub-repair x
2026-03-08T23:20:00.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T23:20:00.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:20:00.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:20:00.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:20:00.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T23:20:00.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:20:01.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:20:01.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:20:01.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:20:01.037 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:20:01.037 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:20:01.037 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:20:01.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:20:01.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:20:01.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5765: TEST_periodic_scrub_replicated: local 'ceph_osd_args=--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 '
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5766: TEST_periodic_scrub_replicated: ceph_osd_args+=--osd_scrub_backoff_ratio=0
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5767: TEST_periodic_scrub_replicated: run_osd td/osd-scrub-repair 0 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:20:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:20:01.056 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0'
2026-03-08T23:20:01.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:20:01.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:20:01.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=633422ab-f353-4b39-aeaf-6460d696a517
2026-03-08T23:20:01.066 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 633422ab-f353-4b39-aeaf-6460d696a517
2026-03-08T23:20:01.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 633422ab-f353-4b39-aeaf-6460d696a517'
2026-03-08T23:20:01.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:20:01.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAhBK5poY01BRAAeSJ/R+8Ix9WJvSTGv4td4w==
2026-03-08T23:20:01.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAhBK5poY01BRAAeSJ/R+8Ix9WJvSTGv4td4w=="}'
2026-03-08T23:20:01.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 633422ab-f353-4b39-aeaf-6460d696a517 -i td/osd-scrub-repair/0/new.json
2026-03-08T23:20:01.194 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:20:01.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T23:20:01.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --mkfs --key AQAhBK5poY01BRAAeSJ/R+8Ix9WJvSTGv4td4w== --osd-uuid 633422ab-f353-4b39-aeaf-6460d696a517
2026-03-08T23:20:01.226 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:01.231+0000 7f6a737908c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:01.234 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:01.239+0000 7f6a737908c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:01.236 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:01.239+0000 7f6a737908c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:01.236 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:01.239+0000 7f6a737908c0 -1 bdev(0x5639f3beac00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T23:20:01.236 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:01.239+0000 7f6a737908c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T23:20:03.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T23:20:03.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:20:03.490 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T23:20:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T23:20:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:20:03.636 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:20:03.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T23:20:03.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0
2026-03-08T23:20:03.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:20:03.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:20:03.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:20:03.659 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:03.663+0000 7f2adab1e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:03.660 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:03.667+0000 7f2adab1e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:03.665 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:03.671+0000 7f2adab1e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:03.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:20:03.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:20:04.849 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:04.855+0000 7f2adab1e8c0 -1 Falling back to public interface
2026-03-08T23:20:04.989 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:20:04.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:20:04.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:04.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:20:04.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:04.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:20:05.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:20:05.826 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:05.831+0000 7f2adab1e8c0 -1 osd.0 0 log_to_monitors true
2026-03-08T23:20:06.157 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:20:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:20:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:20:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:06.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:20:06.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:20:07.334 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:20:07.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:20:07.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:07.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:20:07.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:07.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:20:07.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:20:08.520 INFO:tasks.workunit.client.0.vm03.stdout:4
2026-03-08T23:20:08.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:20:08.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:08.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4
2026-03-08T23:20:08.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:08.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:20:08.682 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/453590111,v1:127.0.0.1:6803/453590111] [v2:127.0.0.1:6804/453590111,v1:127.0.0.1:6805/453590111] exists,up 633422ab-f353-4b39-aeaf-6460d696a517
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5768: TEST_periodic_scrub_replicated: run_osd td/osd-scrub-repair 1 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:20:08.683
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:20:08.683 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:20:08.684 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0' 2026-03-08T23:20:08.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:20:08.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:20:08.686 
INFO:tasks.workunit.client.0.vm03.stdout:add osd1 fa9fb581-8d88-4d32-aa35-0fb824f7ddfd 2026-03-08T23:20:08.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=fa9fb581-8d88-4d32-aa35-0fb824f7ddfd 2026-03-08T23:20:08.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 fa9fb581-8d88-4d32-aa35-0fb824f7ddfd' 2026-03-08T23:20:08.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:20:08.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAoBK5pVtMHKhAA04nxfdB0HfCpFiaSZMpV7w== 2026-03-08T23:20:08.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAoBK5pVtMHKhAA04nxfdB0HfCpFiaSZMpV7w=="}' 2026-03-08T23:20:08.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new fa9fb581-8d88-4d32-aa35-0fb824f7ddfd -i td/osd-scrub-repair/1/new.json 2026-03-08T23:20:08.852 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:20:08.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:20:08.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= 
--run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --mkfs --key AQAoBK5pVtMHKhAA04nxfdB0HfCpFiaSZMpV7w== --osd-uuid fa9fb581-8d88-4d32-aa35-0fb824f7ddfd 2026-03-08T23:20:08.886 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:08.891+0000 7f7b63f1d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:20:08.887 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:08.891+0000 7f7b63f1d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:20:08.888 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:08.891+0000 7f7b63f1d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:20:08.888 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:08.895+0000 7f7b63f1d8c0 -1 bdev(0x562bf589fc00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted
2026-03-08T23:20:08.889 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:08.895+0000 7f7b63f1d8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid
2026-03-08T23:20:11.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring
2026-03-08T23:20:11.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:20:11.134 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T23:20:11.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T23:20:11.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:20:11.360 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T23:20:11.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T23:20:11.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:20:11.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:20:11.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:20:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0
2026-03-08T23:20:11.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:11.391+0000 7f448e0108c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:11.390 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:11.395+0000 7f448e0108c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:11.391 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:11.395+0000 7f448e0108c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:11.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:20:11.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:20:12.597 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:12.603+0000 7f448e0108c0 -1 Falling back to public interface
2026-03-08T23:20:12.714 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:20:12.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:20:12.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:12.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:20:12.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:12.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:20:12.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:20:13.558 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:13.563+0000 7f448e0108c0 -1 osd.1 0 log_to_monitors true
2026-03-08T23:20:13.888 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:20:13.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:20:13.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:13.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:20:13.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:20:13.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:14.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:20:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:20:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:20:15.089 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:20:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:20:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:20:15.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:20:15.265 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/79372107,v1:127.0.0.1:6811/79372107] [v2:127.0.0.1:6812/79372107,v1:127.0.0.1:6813/79372107] exists,up fa9fb581-8d88-4d32-aa35-0fb824f7ddfd
2026-03-08T23:20:15.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:20:15.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:20:15.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:20:15.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5769: TEST_periodic_scrub_replicated: create_rbd_pool
2026-03-08T23:20:15.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it
2026-03-08T23:20:15.427 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist
2026-03-08T23:20:15.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4
2026-03-08T23:20:15.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4
2026-03-08T23:20:15.661 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created
2026-03-08T23:20:15.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:20:16.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd
2026-03-08T23:20:17.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5770: TEST_periodic_scrub_replicated: wait_for_clean
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:20:17.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:20:17.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:20:17.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:20:17.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:20:17.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:20:17.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:20:17.473 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:20:17.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:20:17.644 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:20:17.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:20:17.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:20:17.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:20:17.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483
2026-03-08T23:20:17.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483
2026-03-08T23:20:17.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483'
2026-03-08T23:20:17.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:20:17.728 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:20:17.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672962
2026-03-08T23:20:17.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672962
2026-03-08T23:20:17.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-42949672962'
2026-03-08T23:20:17.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:20:17.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483
2026-03-08T23:20:17.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:20:17.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:20:17.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483
2026-03-08T23:20:17.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:20:17.818 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483
2026-03-08T23:20:17.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483
2026-03-08T23:20:17.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483'
2026-03-08T23:20:17.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:20:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483
2026-03-08T23:20:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:20:18.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:20:18.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:20:19.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483
2026-03-08T23:20:19.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:20:20.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:20:20.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:20:20.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836483
2026-03-08T23:20:20.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:20:20.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672962
2026-03-08T23:20:20.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:20:20.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:20:20.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672962
2026-03-08T23:20:20.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:20:20.314 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672962
2026-03-08T23:20:20.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672962
2026-03-08T23:20:20.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672962'
2026-03-08T23:20:20.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:20:20.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672962 -lt 42949672962
2026-03-08T23:20:20.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:20:20.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:20:20.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:20:20.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0
2026-03-08T23:20:20.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:20:20.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:20:20.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:20:20.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:20:20.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:20:20.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:20:20.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:20:20.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:20:20.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:20:20.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:20:20.860 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:20:21.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:20:21.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:20:21.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:20:21.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5772: TEST_periodic_scrub_replicated: create_pool psr_pool 1 1 2026-03-08T23:20:21.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool 1 1 2026-03-08T23:20:21.255 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool' created 2026-03-08T23:20:21.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:20:22.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5773: TEST_periodic_scrub_replicated: wait_for_clean 2026-03-08T23:20:22.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:20:22.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:20:22.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 
2026-03-08T23:20:22.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:20:22.273 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:20:22.273 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:20:22.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:20:22.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:20:22.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:20:22.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:20:22.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:20:22.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:20:22.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:20:22.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:20:22.340 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:20:22.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:20:22.504 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:20:22.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:20:22.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:22.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:20:22.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T23:20:22.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T23:20:22.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:20:22.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:22.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:20:22.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672964 2026-03-08T23:20:22.656 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672964 2026-03-08T23:20:22.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672964' 2026-03-08T23:20:22.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:22.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:20:22.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:22.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:20:22.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:20:22.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:22.659 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:20:22.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:20:22.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:20:22.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:20:22.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T23:20:22.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:20:23.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:20:23.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:23.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836485 2026-03-08T23:20:23.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:23.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672964 2026-03-08T23:20:23.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:23.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:20:23.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672964 2026-03-08T23:20:23.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:23.984 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672964 
2026-03-08T23:20:23.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672964 2026-03-08T23:20:23.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672964' 2026-03-08T23:20:23.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:20:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672964 -lt 42949672964 2026-03-08T23:20:24.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:20:24.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:20:24.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:20:24.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:20:24.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:20:24.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:20:24.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:20:24.351 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:20:24.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:20:24.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:20:24.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:20:24.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:20:24.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:20:24.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:20:24.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:20:24.718 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5775: TEST_periodic_scrub_replicated: local osd=0 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5776: TEST_periodic_scrub_replicated: add_something td/osd-scrub-repair psr_pool POBJ scrub 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=psr_pool 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=POBJ 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=scrub 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' scrub = noscrub ']' 2026-03-08T23:20:24.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:81: add_something: ceph osd unset noscrub 2026-03-08T23:20:24.930 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is unset 2026-03-08T23:20:24.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:82: add_something: ceph osd unset nodeep-scrub 2026-03-08T23:20:25.136 
INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is unset 2026-03-08T23:20:25.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:20:25.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:20:25.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool psr_pool put POBJ td/osd-scrub-repair/ORIGINAL 2026-03-08T23:20:25.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5777: TEST_periodic_scrub_replicated: get_primary psr_pool POBJ 2026-03-08T23:20:25.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=psr_pool 2026-03-08T23:20:25.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=POBJ 2026-03-08T23:20:25.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map psr_pool POBJ 2026-03-08T23:20:25.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:20:25.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5777: TEST_periodic_scrub_replicated: local primary=0 2026-03-08T23:20:25.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5778: TEST_periodic_scrub_replicated: get_pg psr_pool POBJ 2026-03-08T23:20:25.351 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=psr_pool 2026-03-08T23:20:25.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=POBJ 2026-03-08T23:20:25.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map psr_pool POBJ 2026-03-08T23:20:25.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:20:25.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5778: TEST_periodic_scrub_replicated: local pg=2.0 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5781: TEST_periodic_scrub_replicated: local payload=UVWXYZ 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5782: TEST_periodic_scrub_replicated: echo UVWXYZ 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5784: TEST_periodic_scrub_replicated: objectstore_tool td/osd-scrub-repair 0 POBJ set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: 
objectstore_tool: local id=0 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 POBJ set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: 
kill_daemons: true 2026-03-08T23:20:25.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:20:25.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:20:25.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 POBJ set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:20:25.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:20:25.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:20:25.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:20:25.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:20:25.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:20:25.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 POBJ set-bytes td/osd-scrub-repair/CORRUPT 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 
--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' 
--osd-data=td/osd-scrub-repair/0' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:20:26.816 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 
2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:20:26.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:20:26.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:20:26.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: 
ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0' 2026-03-08T23:20:26.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:20:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:20:26.819 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:20:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 2026-03-08T23:20:26.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:20:26.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:20:26.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:20:26.821 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:20:26.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:20:26.836 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:26.839+0000 7f80dfe158c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:20:26.836 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:26.843+0000 7f80dfe158c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:20:26.838 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:26.843+0000 7f80dfe158c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:20:26.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 
2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:20:26.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:20:27.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:20:28.041 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:28.047+0000 7f80dfe158c0 -1 Falling back to public interface 2026-03-08T23:20:28.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:20:28.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:20:28.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:20:28.162 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:20:28.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:20:28.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:20:28.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:20:29.020 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:20:29.027+0000 7f80dfe158c0 -1 osd.0 22 log_to_monitors true 2026-03-08T23:20:29.323 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:20:29.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:20:29.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:20:29.323 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:20:29.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:20:29.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:20:29.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:20:30.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:20:30.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:20:30.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:20:30.499 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:20:30.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:20:30.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 26 
up_thru 26 down_at 23 last_clean_interval [5,22) [v2:127.0.0.1:6802/1323308906,v1:127.0.0.1:6803/1323308906] [v2:127.0.0.1:6804/1323308906,v1:127.0.0.1:6805/1323308906] exists,up 633422ab-f353-4b39-aeaf-6460d696a517 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:20:30.671 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:20:30.672 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:20:30.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:20:30.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:20:30.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:20:30.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:20:30.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:20:30.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:20:30.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:20:30.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:20:30.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:20:30.907 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:20:30.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:20:30.907 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:30.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:20:30.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149698 2026-03-08T23:20:30.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149698 2026-03-08T23:20:30.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149698' 2026-03-08T23:20:30.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:30.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:20:31.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672967 2026-03-08T23:20:31.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672967 2026-03-08T23:20:31.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149698 1-42949672967' 2026-03-08T23:20:31.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:31.076 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-111669149698 2026-03-08T23:20:31.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:31.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:20:31.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-111669149698 2026-03-08T23:20:31.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:31.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149698 2026-03-08T23:20:31.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 111669149698' 2026-03-08T23:20:31.079 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 111669149698 2026-03-08T23:20:31.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:31.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 111669149698 2026-03-08T23:20:31.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:20:32.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:20:32.249 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:32.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149698 -lt 111669149698 2026-03-08T23:20:32.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:32.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672967 2026-03-08T23:20:32.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:32.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:20:32.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672967 2026-03-08T23:20:32.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:32.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672967 2026-03-08T23:20:32.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672967' 2026-03-08T23:20:32.418 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672967 2026-03-08T23:20:32.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 
2026-03-08T23:20:32.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672967 -lt 42949672967 2026-03-08T23:20:32.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:20:32.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:20:32.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:20:32.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:20:32.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:20:32.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:20:32.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:20:32.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:20:32.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:20:32.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg 
dump pgs 2026-03-08T23:20:32.794 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:20:32.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:20:32.962 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:20:32.962 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:20:32.962 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:20:33.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:20:33.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:20:33.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:20:33.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5787: TEST_periodic_scrub_replicated: set -o pipefail 2026-03-08T23:20:33.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5788: TEST_periodic_scrub_replicated: rados list-inconsistent-obj 2.0 2026-03-08T23:20:33.169 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5788: TEST_periodic_scrub_replicated: jq . 2026-03-08T23:20:33.186 INFO:tasks.workunit.client.0.vm03.stderr:No scrub information available for pg 2.0 2026-03-08T23:20:33.186 INFO:tasks.workunit.client.0.vm03.stderr:error 2: (2) No such file or directory 2026-03-08T23:20:33.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5789: TEST_periodic_scrub_replicated: set +o pipefail 2026-03-08T23:20:33.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5791: TEST_periodic_scrub_replicated: pg_deep_scrub 2.0 2026-03-08T23:20:33.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1941: pg_deep_scrub: local pgid=2.0 2026-03-08T23:20:33.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1943: pg_deep_scrub: wait_for_pg_clean 2.0 2026-03-08T23:20:33.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=2.0 2026-03-08T23:20:33.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:20:33.189 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:20:33.189 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:20:33.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 
2026-03-08T23:20:33.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:20:33.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:20:33.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:20:33.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:20:33.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:20:33.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:20:33.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:20:33.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:20:33.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:20:33.517 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:20:33.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:20:33.517 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:33.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:20:33.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149699 2026-03-08T23:20:33.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149699 2026-03-08T23:20:33.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149699' 2026-03-08T23:20:33.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:33.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:20:33.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672968 2026-03-08T23:20:33.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672968 2026-03-08T23:20:33.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149699 1-42949672968' 2026-03-08T23:20:33.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:33.674 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-111669149699 2026-03-08T23:20:33.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:33.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:20:33.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-111669149699 2026-03-08T23:20:33.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:33.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149699 2026-03-08T23:20:33.677 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 111669149699 2026-03-08T23:20:33.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 111669149699' 2026-03-08T23:20:33.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:33.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149698 -lt 111669149699 2026-03-08T23:20:33.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:20:34.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T23:20:34.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149698 -lt 111669149699 2026-03-08T23:20:35.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:20:36.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:20:36.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149700 -lt 111669149699 2026-03-08T23:20:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:36.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672968 2026-03-08T23:20:36.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:36.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:20:36.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672968 2026-03-08T23:20:36.175 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:36.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672968 2026-03-08T23:20:36.176 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672968 2026-03-08T23:20:36.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672968' 2026-03-08T23:20:36.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:20:36.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672968 -lt 42949672968 2026-03-08T23:20:36.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:20:36.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 2.0 loop 0' 2026-03-08T23:20:36.336 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 2.0 loop 0 2026-03-08T23:20:36.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 2.0 2026-03-08T23:20:36.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=2.0 2026-03-08T23:20:36.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:20:36.337 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 2.0 query 2026-03-08T23:20:36.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:20:36.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:20:36.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:20:36.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:20:36.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:20:36.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: get_last_scrub_stamp 2.0 last_deep_scrub_stamp 2026-03-08T23:20:36.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:36.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:20:36.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:36.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | 
.last_deep_scrub_stamp' 2026-03-08T23:20:36.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: local last_scrub=2026-03-08T23:20:21.261747+0000 2026-03-08T23:20:36.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: pg_deep_scrub: ceph pg deep-scrub 2.0 2026-03-08T23:20:36.728 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0 on osd.0 to deep-scrub 2026-03-08T23:20:36.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: pg_deep_scrub: wait_for_scrub 2.0 2026-03-08T23:20:21.261747+0000 last_deep_scrub_stamp 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:20:21.261747+0000 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_deep_scrub_stamp 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_deep_scrub_stamp 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local 
pgid=2.0 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:36.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_deep_scrub_stamp' 2026-03-08T23:20:36.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:21.261747+0000 '>' 2026-03-08T23:20:21.261747+0000 2026-03-08T23:20:36.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:20:37.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:20:37.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:37.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_deep_scrub_stamp 2026-03-08T23:20:37.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:37.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:20:37.911 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:37.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_deep_scrub_stamp' 2026-03-08T23:20:38.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:21.261747+0000 '>' 2026-03-08T23:20:21.261747+0000 2026-03-08T23:20:38.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:20:39.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:20:39.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:39.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_deep_scrub_stamp 2026-03-08T23:20:39.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:39.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:20:39.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:39.075 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_deep_scrub_stamp' 2026-03-08T23:20:39.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:21.261747+0000 '>' 2026-03-08T23:20:21.261747+0000 2026-03-08T23:20:39.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:20:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:20:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:40.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_deep_scrub_stamp 2026-03-08T23:20:40.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:40.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:20:40.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:40.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_deep_scrub_stamp' 2026-03-08T23:20:40.416 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:36.915273+0000 '>' 2026-03-08T23:20:21.261747+0000 2026-03-08T23:20:40.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:20:40.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5794: TEST_periodic_scrub_replicated: rados list-inconsistent-obj 2.0 2026-03-08T23:20:40.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5794: TEST_periodic_scrub_replicated: grep -q POBJ 2026-03-08T23:20:40.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5794: TEST_periodic_scrub_replicated: jq . 2026-03-08T23:20:40.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5796: TEST_periodic_scrub_replicated: flush_pg_stats 2026-03-08T23:20:40.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:20:40.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:20:40.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:20:40.597 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:20:40.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:20:40.597 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:40.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:20:40.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149702 2026-03-08T23:20:40.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149702 2026-03-08T23:20:40.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149702' 2026-03-08T23:20:40.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:40.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:20:40.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672971 2026-03-08T23:20:40.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672971 2026-03-08T23:20:40.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149702 1-42949672971' 2026-03-08T23:20:40.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:40.755 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-111669149702 2026-03-08T23:20:40.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:40.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:20:40.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-111669149702 2026-03-08T23:20:40.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:40.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149702 2026-03-08T23:20:40.758 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 111669149702 2026-03-08T23:20:40.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 111669149702' 2026-03-08T23:20:40.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:40.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149701 -lt 111669149702 2026-03-08T23:20:40.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:20:41.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 
2026-03-08T23:20:41.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:42.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149702 -lt 111669149702 2026-03-08T23:20:42.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:42.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672971 2026-03-08T23:20:42.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:42.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:20:42.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672971 2026-03-08T23:20:42.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:42.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672971 2026-03-08T23:20:42.086 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672971 2026-03-08T23:20:42.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672971' 2026-03-08T23:20:42.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T23:20:42.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672971 -lt 42949672971 2026-03-08T23:20:42.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5797: TEST_periodic_scrub_replicated: get_last_scrub_stamp 2.0 2026-03-08T23:20:42.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:42.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:20:42.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:42.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:20:42.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5797: TEST_periodic_scrub_replicated: local last_scrub=2026-03-08T23:20:36.915273+0000 2026-03-08T23:20:42.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5799: TEST_periodic_scrub_replicated: ceph tell 2.0 schedule-scrub 2026-03-08T23:20:42.493 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:20:42.493 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:20:42.493 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:20:42.493 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:19:02.501322+0000" 2026-03-08T23:20:42.493 
INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:20:42.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5801: TEST_periodic_scrub_replicated: wait_for_scrub 2.0 2026-03-08T23:20:36.915273+0000 2026-03-08T23:20:42.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0 2026-03-08T23:20:42.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:20:36.915273+0000 2026-03-08T23:20:42.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:20:42.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:20:42.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:42.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:20:42.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:42.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:20:42.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:42.504 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:20:42.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:36.915273+0000 '>' 2026-03-08T23:20:36.915273+0000 2026-03-08T23:20:42.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:20:43.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:20:43.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:43.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:20:43.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:43.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:20:43.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:43.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:20:43.836 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:36.915273+0000 '>' 2026-03-08T23:20:36.915273+0000 2026-03-08T23:20:43.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:20:44.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:20:44.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:44.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:20:44.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:44.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:20:44.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:44.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:20:45.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:36.915273+0000 '>' 2026-03-08T23:20:36.915273+0000 2026-03-08T23:20:45.006 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:20:46.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:20:46.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:20:46.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp 2026-03-08T23:20:46.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0 2026-03-08T23:20:46.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:20:46.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:20:46.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp' 2026-03-08T23:20:46.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:42.949425+0000 '>' 2026-03-08T23:20:36.915273+0000 2026-03-08T23:20:46.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:20:46.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5804: 
TEST_periodic_scrub_replicated: grep -q 'Deep scrub errors, upgrading scrub to deep-scrub' td/osd-scrub-repair/osd.0.log 2026-03-08T23:20:46.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5807: TEST_periodic_scrub_replicated: rados list-inconsistent-obj 2.0 2026-03-08T23:20:46.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5807: TEST_periodic_scrub_replicated: grep -q POBJ 2026-03-08T23:20:46.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5807: TEST_periodic_scrub_replicated: jq . 2026-03-08T23:20:46.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5810: TEST_periodic_scrub_replicated: ceph osd set nodeep-scrub 2026-03-08T23:20:46.408 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:20:46.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5812: TEST_periodic_scrub_replicated: ceph tell osd.0 get_latest_osdmap 2026-03-08T23:20:46.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5813: TEST_periodic_scrub_replicated: flush_pg_stats 2026-03-08T23:20:46.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:20:46.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:20:46.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:20:46.663 INFO:tasks.workunit.client.0.vm03.stderr:1' 
2026-03-08T23:20:46.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:20:46.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:46.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:20:46.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149704 2026-03-08T23:20:46.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149704 2026-03-08T23:20:46.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149704' 2026-03-08T23:20:46.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:20:46.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:20:46.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672973 2026-03-08T23:20:46.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672973 2026-03-08T23:20:46.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149704 1-42949672973' 2026-03-08T23:20:46.823 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:20:46.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-111669149704 2026-03-08T23:20:46.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:20:46.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:20:46.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-111669149704 2026-03-08T23:20:46.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:20:46.826 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 111669149704 2026-03-08T23:20:46.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149704 2026-03-08T23:20:46.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 111669149704' 2026-03-08T23:20:46.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:20:46.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149703 -lt 111669149704 2026-03-08T23:20:46.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T23:20:47.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:20:47.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:20:48.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149704 -lt 111669149704
2026-03-08T23:20:48.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:20:48.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672973
2026-03-08T23:20:48.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:20:48.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:20:48.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672973
2026-03-08T23:20:48.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:20:48.194 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672973
2026-03-08T23:20:48.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672973
2026-03-08T23:20:48.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672973'
2026-03-08T23:20:48.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:20:48.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672973 -lt 42949672973
2026-03-08T23:20:48.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5814: TEST_periodic_scrub_replicated: sleep 5
2026-03-08T23:20:53.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5817: TEST_periodic_scrub_replicated: ceph tell 2.0 schedule-scrub
2026-03-08T23:20:53.455 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:20:53.455 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
2026-03-08T23:20:53.455 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T23:20:53.455 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:19:13.462696+0000"
2026-03-08T23:20:53.455 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:20:53.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5820: TEST_periodic_scrub_replicated: local found=false
2026-03-08T23:20:53.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5821: TEST_periodic_scrub_replicated: seq 14 -1 0
2026-03-08T23:20:53.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5821: TEST_periodic_scrub_replicated: for i in $(seq 14 -1 0)
2026-03-08T23:20:53.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5823: TEST_periodic_scrub_replicated: sleep 1
2026-03-08T23:20:54.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5824: TEST_periodic_scrub_replicated: grep -q 'Regular scrub skipped due to deep-scrub errors and nodeep-scrub set' td/osd-scrub-repair/osd.0.log
2026-03-08T23:20:54.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5824: TEST_periodic_scrub_replicated: found=true
2026-03-08T23:20:54.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5824: TEST_periodic_scrub_replicated: break
2026-03-08T23:20:54.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5827: TEST_periodic_scrub_replicated: test true = true
2026-03-08T23:20:54.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5830: TEST_periodic_scrub_replicated: rados list-inconsistent-obj 2.0
2026-03-08T23:20:54.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5830: TEST_periodic_scrub_replicated: grep -q POBJ
2026-03-08T23:20:54.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5830: TEST_periodic_scrub_replicated: jq .
2026-03-08T23:20:54.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5832: TEST_periodic_scrub_replicated: flush_pg_stats
2026-03-08T23:20:54.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:20:54.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:20:54.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:20:54.656 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:20:54.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:20:54.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:20:54.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:20:54.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149707
2026-03-08T23:20:54.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149707
2026-03-08T23:20:54.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149707'
2026-03-08T23:20:54.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:20:54.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:20:54.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672975
2026-03-08T23:20:54.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672975
2026-03-08T23:20:54.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149707 1-42949672975'
2026-03-08T23:20:54.814 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:20:54.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-111669149707
2026-03-08T23:20:54.814 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:20:54.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:20:54.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-111669149707
2026-03-08T23:20:54.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:20:54.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149707
2026-03-08T23:20:54.816 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 111669149707
2026-03-08T23:20:54.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 111669149707'
2026-03-08T23:20:54.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:20:54.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149705 -lt 111669149707
2026-03-08T23:20:54.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:20:55.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:20:55.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:20:56.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149707 -lt 111669149707
2026-03-08T23:20:56.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:20:56.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672975
2026-03-08T23:20:56.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:20:56.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:20:56.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672975
2026-03-08T23:20:56.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:20:56.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672975
2026-03-08T23:20:56.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672975'
2026-03-08T23:20:56.144 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672975
2026-03-08T23:20:56.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:20:56.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672975 -lt 42949672975
2026-03-08T23:20:56.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5834: TEST_periodic_scrub_replicated: pg_scrub 2.0
2026-03-08T23:20:56.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=2.0
2026-03-08T23:20:56.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 2.0
2026-03-08T23:20:56.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=2.0
2026-03-08T23:20:56.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3
2026-03-08T23:20:56.307 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:20:56.307 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:20:56.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:20:56.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:20:56.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:20:56.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3')
2026-03-08T23:20:56.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays
2026-03-08T23:20:56.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0
2026-03-08T23:20:56.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats
2026-03-08T23:20:56.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:20:56.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:20:56.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:20:56.620 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:20:56.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:20:56.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:20:56.621 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:20:56.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149708
2026-03-08T23:20:56.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149708
2026-03-08T23:20:56.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149708'
2026-03-08T23:20:56.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:20:56.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:20:56.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672977
2026-03-08T23:20:56.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672977
2026-03-08T23:20:56.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149708 1-42949672977'
2026-03-08T23:20:56.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:20:56.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-111669149708
2026-03-08T23:20:56.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:20:56.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:20:56.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-111669149708
2026-03-08T23:20:56.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:20:56.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149708
2026-03-08T23:20:56.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 111669149708'
2026-03-08T23:20:56.784 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 111669149708
2026-03-08T23:20:56.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:20:56.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149707 -lt 111669149708
2026-03-08T23:20:56.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:20:57.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:20:57.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:20:58.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149708 -lt 111669149708
2026-03-08T23:20:58.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:20:58.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:20:58.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672977
2026-03-08T23:20:58.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:20:58.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672977
2026-03-08T23:20:58.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:20:58.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672977
2026-03-08T23:20:58.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672977'
2026-03-08T23:20:58.120 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672977
2026-03-08T23:20:58.120 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:20:58.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672977 -lt 42949672977
2026-03-08T23:20:58.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true
2026-03-08T23:20:58.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 2.0 loop 0'
2026-03-08T23:20:58.286 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 2.0 loop 0
2026-03-08T23:20:58.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 2.0
2026-03-08T23:20:58.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=2.0
2026-03-08T23:20:58.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state
2026-03-08T23:20:58.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 2.0 query
2026-03-08T23:20:58.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state '
2026-03-08T23:20:58.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean+inconsistent
2026-03-08T23:20:58.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean+inconsistent == \a\c\t\i\v\e\+\c\l\e\a\n* ]]
2026-03-08T23:20:58.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break
2026-03-08T23:20:58.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0
2026-03-08T23:20:58.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 2.0
2026-03-08T23:20:58.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:20:58.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:20:58.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:20:58.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:20:58.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local last_scrub=2026-03-01T23:19:13.462696+0000
2026-03-08T23:20:58.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 2.0
2026-03-08T23:20:58.692 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 2.0 on osd.0 to scrub
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 2.0 2026-03-01T23:19:13.462696+0000
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=2.0
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-01T23:19:13.462696+0000
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:20:58.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:20:58.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:19:13.462696+0000 '>' 2026-03-01T23:19:13.462696+0000
2026-03-08T23:20:58.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:20:59.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:20:59.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:20:59.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 2.0 last_scrub_stamp
2026-03-08T23:20:59.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=2.0
2026-03-08T23:20:59.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:20:59.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:20:59.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="2.0") | .last_scrub_stamp'
2026-03-08T23:21:00.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:20:58.880015+0000 '>' 2026-03-01T23:19:13.462696+0000
2026-03-08T23:21:00.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T23:21:00.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5835: TEST_periodic_scrub_replicated: grep -q 'Regular scrub request, deep-scrub details will be lost' td/osd-scrub-repair/osd.0.log
2026-03-08T23:21:00.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5838: TEST_periodic_scrub_replicated: rados list-inconsistent-obj 2.0
2026-03-08T23:21:00.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5838: TEST_periodic_scrub_replicated: grep -qv POBJ
2026-03-08T23:21:00.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5838: TEST_periodic_scrub_replicated: jq .
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:21:00.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:21:00.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:21:00.198 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:21:00.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:21:00.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:21:00.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:21:00.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:21:00.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:21:00.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:21:00.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:21:00.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:21:00.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:21:00.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:21:00.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:21:00.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:21:00.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:21:00.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:21:00.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:21:00.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:21:00.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:21:00.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:21:00.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:21:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:21:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:21:00.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:21:00.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:21:00.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:21:00.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:21:00.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:21:00.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:21:00.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:21:00.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:21:00.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:21:00.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:21:00.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:21:00.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:21:00.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:21:00.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:21:00.225 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:21:00.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:21:00.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:21:00.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:21:00.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:21:00.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:21:00.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:21:00.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:21:00.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:21:00.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_repair_stats td/osd-scrub-repair 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:654: TEST_repair_stats: local dir=td/osd-scrub-repair 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:655: TEST_repair_stats: local poolname=testpool 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:656: TEST_repair_stats: local OSDS=2 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:657: TEST_repair_stats: local OBJS=30 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:659: TEST_repair_stats: local REPAIRS=20 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:662: TEST_repair_stats: run_mon td/osd-scrub-repair a 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 
2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:21:00.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T23:21:00.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:21:00.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:21:00.273 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:21:00.274 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:21:00.274 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:21:00.274 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:21:00.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:21:00.274 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:21:00.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:21:00.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:21:00.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:21:00.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:21:00.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:21:00.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:21:00.313 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:21:00.313 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:21:00.313 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:21:00.314 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:21:00.314 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:21:00.314 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:21:00.314 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:21:00.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:21:00.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:21:00.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:21:00.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:21:00.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:21:00.397 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:21:00.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:21:00.397 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:21:00.397 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:21:00.397 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:21:00.398 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:21:00.398 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:21:00.398 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:21:00.399 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:21:00.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:21:00.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:21:00.463 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:663: TEST_repair_stats: run_mgr td/osd-scrub-repair x 2026-03-08T23:21:00.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:21:00.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:21:00.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:21:00.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:21:00.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:21:00.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:21:00.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:21:00.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:21:00.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:21:00.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:21:00.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' 
']' 2026-03-08T23:21:00.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:21:00.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:21:00.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:21:00.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:21:00.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:665: TEST_repair_stats: local 'ceph_osd_args=--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T23:21:00.594 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:666: TEST_repair_stats: expr 2 - 1 2026-03-08T23:21:00.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:666: TEST_repair_stats: seq 0 1 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:666: TEST_repair_stats: for id in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:21:00.602 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:667: TEST_repair_stats: run_osd td/osd-scrub-repair 0 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:21:00.602 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:21:00.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:21:00.603 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:21:00.603 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T23:21:00.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:21:00.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:21:00.605 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 24294bc8-a770-45cb-bc16-354ac54507cd 2026-03-08T23:21:00.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=24294bc8-a770-45cb-bc16-354ac54507cd 2026-03-08T23:21:00.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 24294bc8-a770-45cb-bc16-354ac54507cd' 2026-03-08T23:21:00.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:21:00.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBcBK5p2jcsJRAAI8UCmNn7aB/Mxv8fziHoxg== 2026-03-08T23:21:00.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBcBK5p2jcsJRAAI8UCmNn7aB/Mxv8fziHoxg=="}' 
2026-03-08T23:21:00.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 24294bc8-a770-45cb-bc16-354ac54507cd -i td/osd-scrub-repair/0/new.json 2026-03-08T23:21:00.715 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:21:00.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:21:00.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQBcBK5p2jcsJRAAI8UCmNn7aB/Mxv8fziHoxg== --osd-uuid 24294bc8-a770-45cb-bc16-354ac54507cd 2026-03-08T23:21:00.744 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:00.747+0000 7f50871d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:21:00.746 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:00.751+0000 7f50871d88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:21:00.748 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:00.751+0000 7f50871d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:21:00.748 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:00.751+0000 7f50871d88c0 -1 bdev(0x564959152c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:21:00.748 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:00.751+0000 7f50871d88c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:21:03.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:21:03.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:21:03.025 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:21:03.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:21:03.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:21:03.150 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:21:03.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:21:03.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:21:03.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:21:03.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:21:03.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:21:03.204 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:03.203+0000 7fa33ce548c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:21:03.217 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:03.223+0000 7fa33ce548c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:21:03.219 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:03.223+0000 7fa33ce548c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:21:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:21:03.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:21:03.677 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:03.683+0000 7fa33ce548c0 -1 Falling back to public interface 2026-03-08T23:21:04.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:21:04.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:21:04.489 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:21:04.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:21:04.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:21:04.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:21:04.651 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:04.655+0000 7fa33ce548c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:21:04.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:21:05.636 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:05.639+0000 7fa33860d640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:21:05.658 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:21:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:21:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:21:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:21:05.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:21:05.659 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:21:05.825 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/4003246976,v1:127.0.0.1:6803/4003246976] [v2:127.0.0.1:6804/4003246976,v1:127.0.0.1:6805/4003246976] exists,up 24294bc8-a770-45cb-bc16-354ac54507cd 2026-03-08T23:21:05.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:21:05.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:21:05.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:21:05.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:666: TEST_repair_stats: for id in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:21:05.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:667: TEST_repair_stats: run_osd td/osd-scrub-repair 1 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:21:05.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:21:05.826 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:21:05.826 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:21:05.826 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:21:05.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:21:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:21:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:21:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:21:05.827 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:21:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:21:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:21:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:21:05.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:21:05.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:21:05.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:21:05.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:21:05.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T23:21:05.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:21:05.828 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:21:05.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4aade09a-17c1-4ce9-9df5-1fbedce31c27 2026-03-08T23:21:05.829 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 4aade09a-17c1-4ce9-9df5-1fbedce31c27 2026-03-08T23:21:05.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 4aade09a-17c1-4ce9-9df5-1fbedce31c27' 2026-03-08T23:21:05.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:21:05.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBhBK5piBGHMhAAuL06dV2VA/PW/41iwsxhHQ== 2026-03-08T23:21:05.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBhBK5piBGHMhAAuL06dV2VA/PW/41iwsxhHQ=="}' 2026-03-08T23:21:05.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4aade09a-17c1-4ce9-9df5-1fbedce31c27 -i td/osd-scrub-repair/1/new.json 2026-03-08T23:21:06.002 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:21:06.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:21:06.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false 
--osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQBhBK5piBGHMhAAuL06dV2VA/PW/41iwsxhHQ== --osd-uuid 4aade09a-17c1-4ce9-9df5-1fbedce31c27 2026-03-08T23:21:06.033 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:06.039+0000 7fb8da62a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:21:06.035 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:06.039+0000 7fb8da62a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:21:06.036 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:06.039+0000 7fb8da62a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:21:06.036 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:06.043+0000 7fb8da62a8c0 -1 bdev(0x55a750f1dc00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:21:06.036 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:06.043+0000 7fb8da62a8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:21:08.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:21:08.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:21:08.794 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:21:08.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:21:08.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:21:08.995 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:21:08.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:21:08.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:21:08.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:21:08.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:21:09.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:21:09.011 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:09.015+0000 7fd0312558c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:21:09.011 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:09.015+0000 7fd0312558c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:21:09.013 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:09.019+0000 7fd0312558c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:21:09.165 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:21:09.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:21:09.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:21:09.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:21:09.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:21:09.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:21:09.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:21:09.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:21:09.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:21:09.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:21:09.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:21:09.965 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:09.971+0000 7fd0312558c0 -1 Falling back to public interface 2026-03-08T23:21:10.334 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:21:10.334 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:21:10.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:21:10.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:21:10.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:21:10.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:21:10.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:21:10.939 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:10.943+0000 7fd0312558c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:21:11.499 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:21:11.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:21:11.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:21:11.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:21:11.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:21:11.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:21:11.678 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:21:11.985 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:11.991+0000 7fd02ca0e640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T23:21:12.681 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:21:12.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:21:12.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:21:12.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:21:12.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:21:12.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:21:12.847 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3687859168,v1:127.0.0.1:6811/3687859168] [v2:127.0.0.1:6812/3687859168,v1:127.0.0.1:6813/3687859168] exists,up 4aade09a-17c1-4ce9-9df5-1fbedce31c27 2026-03-08T23:21:12.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:21:12.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:21:12.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:21:12.847 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:670: TEST_repair_stats: create_pool testpool 1 1 2026-03-08T23:21:12.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create testpool 1 1 2026-03-08T23:21:13.091 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool' created 2026-03-08T23:21:13.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:21:14.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:671: TEST_repair_stats: ceph osd pool set testpool size 2 2026-03-08T23:21:14.311 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 size to 2 2026-03-08T23:21:14.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:672: TEST_repair_stats: wait_for_clean 2026-03-08T23:21:14.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:21:14.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:21:14.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:21:14.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:21:14.329 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:21:14.329 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:21:14.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:21:14.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:21:14.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:21:14.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:21:14.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:21:14.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:21:14.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:21:14.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:21:14.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:21:14.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:21:14.561 INFO:tasks.workunit.client.0.vm03.stderr:1' 
2026-03-08T23:21:14.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:21:14.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:21:14.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:21:14.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483 2026-03-08T23:21:14.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483 2026-03-08T23:21:14.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483' 2026-03-08T23:21:14.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:21:14.641 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:21:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672962 2026-03-08T23:21:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672962 2026-03-08T23:21:14.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-42949672962' 2026-03-08T23:21:14.719 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:21:14.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483
2026-03-08T23:21:14.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:21:14.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:21:14.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483
2026-03-08T23:21:14.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:21:14.722 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483
2026-03-08T23:21:14.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483
2026-03-08T23:21:14.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483'
2026-03-08T23:21:14.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:21:14.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483
2026-03-08T23:21:14.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:21:15.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:21:15.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:21:16.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836483
2026-03-08T23:21:16.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:21:16.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672962
2026-03-08T23:21:16.046 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:21:16.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:21:16.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672962
2026-03-08T23:21:16.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:21:16.049 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672962
2026-03-08T23:21:16.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672962
2026-03-08T23:21:16.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672962'
2026-03-08T23:21:16.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:21:16.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672962 -lt 42949672962
2026-03-08T23:21:16.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:21:16.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:21:16.210 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:21:16.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T23:21:16.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:21:16.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:21:16.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:21:16.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:21:16.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:21:16.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:21:16.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:21:16.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T23:21:16.568 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:21:16.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:21:16.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:21:16.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T23:21:16.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:21:16.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:21:16.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:675: TEST_repair_stats: local payload=ABCDEF
2026-03-08T23:21:16.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:676: TEST_repair_stats: echo ABCDEF
2026-03-08T23:21:16.762 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: seq 1 30
2026-03-08T23:21:16.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj1 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj2 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj3 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj4 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj5 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj6 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj7 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj8 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj9 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj10 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj11 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:16.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:16.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj12 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj13 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj14 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj15 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj16 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj17 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj18 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj19 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj20 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj21 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj22 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj23 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj24 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj25 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj26 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj27 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj28 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj29 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:677: TEST_repair_stats: for i in $(seq 1 $OBJS)
2026-03-08T23:21:17.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:679: TEST_repair_stats: rados --pool testpool put obj30 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:21:17.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:684: TEST_repair_stats: get_not_primary testpool obj1
2026-03-08T23:21:17.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool
2026-03-08T23:21:17.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=obj1
2026-03-08T23:21:17.401 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool obj1
2026-03-08T23:21:17.401 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T23:21:17.401 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1
2026-03-08T23:21:17.401 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool obj1
2026-03-08T23:21:17.401 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T23:21:17.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1
2026-03-08T23:21:17.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool obj1
2026-03-08T23:21:17.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]'
2026-03-08T23:21:17.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:684: TEST_repair_stats: local other=0
2026-03-08T23:21:17.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:685: TEST_repair_stats: get_pg testpool obj1
2026-03-08T23:21:17.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=testpool
2026-03-08T23:21:17.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=obj1
2026-03-08T23:21:17.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map testpool obj1
2026-03-08T23:21:17.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T23:21:17.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:685: TEST_repair_stats: local pgid=1.0
2026-03-08T23:21:17.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:686: TEST_repair_stats: get_primary testpool obj1
2026-03-08T23:21:17.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T23:21:17.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1
2026-03-08T23:21:17.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool obj1
2026-03-08T23:21:17.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T23:21:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:686: TEST_repair_stats: local primary=1
2026-03-08T23:21:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:688: TEST_repair_stats: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T23:21:18.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:21:18.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:21:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:21:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:21:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:21:18.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:21:18.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:689: TEST_repair_stats: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:21:18.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:21:18.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:21:18.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:21:18.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:21:18.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:21:18.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:21:18.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: seq 1 20
2026-03-08T23:21:18.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS)
2026-03-08T23:21:18.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 1 % 2
2026-03-08T23:21:18.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1
2026-03-08T23:21:18.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj1 remove
2026-03-08T23:21:18.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:21:18.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:21:18.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:21:18.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:21:18.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:21:18.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj1 remove
2026-03-08T23:21:18.938 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:ff7b1f36:::obj1:head#
2026-03-08T23:21:19.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS)
2026-03-08T23:21:19.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 2 % 2
2026-03-08T23:21:19.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0
2026-03-08T23:21:19.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 0 obj2 remove
2026-03-08T23:21:19.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:21:19.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:21:19.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:21:19.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:21:19.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:21:19.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj2 remove
2026-03-08T23:21:20.126 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:104778fc:::obj2:head#
2026-03-08T23:21:20.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS)
2026-03-08T23:21:20.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 3 % 2
2026-03-08T23:21:20.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1
2026-03-08T23:21:20.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj3 remove
2026-03-08T23:21:20.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:21:20.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:21:20.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:21:20.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:21:20.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:21:20.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj3 remove
2026-03-08T23:21:21.290 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:8dd16f86:::obj3:head#
2026-03-08T23:21:21.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS)
2026-03-08T23:21:21.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 4 % 2
2026-03-08T23:21:21.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0
2026-03-08T23:21:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 0 obj4 remove
2026-03-08T23:21:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:21:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:21:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:21:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:21:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:21:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj4 remove
2026-03-08T23:21:22.456 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:0ee9ae15:::obj4:head#
2026-03-08T23:21:22.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS)
2026-03-08T23:21:22.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 5 % 2
2026-03-08T23:21:22.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1
2026-03-08T23:21:22.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj5 remove
2026-03-08T23:21:22.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:21:22.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:21:22.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:21:22.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:21:22.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:21:22.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj5 remove
2026-03-08T23:21:23.620 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:c52c9666:::obj5:head#
2026-03-08T23:21:24.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS)
2026-03-08T23:21:24.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 6 % 2
2026-03-08T23:21:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0
2026-03-08T23:21:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 0 obj6 remove
2026-03-08T23:21:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:21:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:21:24.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj6 remove 2026-03-08T23:21:24.780 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:233a42c1:::obj6:head# 2026-03-08T23:21:25.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:25.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 7 % 2 2026-03-08T23:21:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1 2026-03-08T23:21:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj7 
remove 2026-03-08T23:21:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:21:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:21:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj7 remove 2026-03-08T23:21:25.958 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:9a09113a:::obj7:head# 2026-03-08T23:21:26.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:26.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 8 % 2 2026-03-08T23:21:26.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0 2026-03-08T23:21:26.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 
0 obj8 remove 2026-03-08T23:21:26.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:26.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:26.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:21:26.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:26.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:21:26.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj8 remove 2026-03-08T23:21:27.122 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:a557efb1:::obj8:head# 2026-03-08T23:21:27.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:27.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 9 % 2 2026-03-08T23:21:27.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1 2026-03-08T23:21:27.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown 
td/osd-scrub-repair 1 obj9 remove 2026-03-08T23:21:27.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:27.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:27.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:21:27.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:27.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:21:27.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj9 remove 2026-03-08T23:21:28.275 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:fe351e27:::obj9:head# 2026-03-08T23:21:28.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:28.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 10 % 2 2026-03-08T23:21:28.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0 2026-03-08T23:21:28.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: 
_objectstore_tool_nodown td/osd-scrub-repair 0 obj10 remove 2026-03-08T23:21:28.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:28.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:28.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:21:28.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:28.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:21:28.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj10 remove 2026-03-08T23:21:29.437 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:d1337354:::obj10:head# 2026-03-08T23:21:29.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:29.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 11 % 2 2026-03-08T23:21:29.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1 2026-03-08T23:21:29.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: 
TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj11 remove 2026-03-08T23:21:29.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:29.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:29.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:21:29.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:29.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:21:29.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj11 remove 2026-03-08T23:21:30.593 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:eeae2a94:::obj11:head# 2026-03-08T23:21:31.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:31.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 12 % 2 2026-03-08T23:21:31.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0 2026-03-08T23:21:31.129 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 0 obj12 remove 2026-03-08T23:21:31.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:31.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:31.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:21:31.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:31.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:21:31.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj12 remove 2026-03-08T23:21:31.757 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:af5f95cb:::obj12:head# 2026-03-08T23:21:32.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:32.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 13 % 2 2026-03-08T23:21:32.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1 
2026-03-08T23:21:32.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj13 remove 2026-03-08T23:21:32.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:32.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:32.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:21:32.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:32.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:21:32.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj13 remove 2026-03-08T23:21:32.921 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:a61ad63e:::obj13:head# 2026-03-08T23:21:33.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:33.451 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 14 % 2 2026-03-08T23:21:33.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: 
TEST_repair_stats: OSD=0 2026-03-08T23:21:33.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 0 obj14 remove 2026-03-08T23:21:33.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:33.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:33.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:21:33.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:33.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:21:33.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj14 remove 2026-03-08T23:21:34.088 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:e0a44829:::obj14:head# 2026-03-08T23:21:34.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:34.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 15 % 2 2026-03-08T23:21:34.625 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1 2026-03-08T23:21:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj15 remove 2026-03-08T23:21:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:21:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:21:34.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj15 remove 2026-03-08T23:21:35.505 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:ab946124:::obj15:head# 2026-03-08T23:21:36.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:36.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 16 % 2 
2026-03-08T23:21:36.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0 2026-03-08T23:21:36.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 0 obj16 remove 2026-03-08T23:21:36.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:36.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:36.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:21:36.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:36.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:21:36.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj16 remove 2026-03-08T23:21:36.913 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:461f8b5e:::obj16:head# 2026-03-08T23:21:37.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:37.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: 
expr 17 % 2 2026-03-08T23:21:37.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1 2026-03-08T23:21:37.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj17 remove 2026-03-08T23:21:37.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:37.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:37.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:21:37.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:37.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:21:37.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj17 remove 2026-03-08T23:21:38.078 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:24fc6e92:::obj17:head# 2026-03-08T23:21:38.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:38.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: 
TEST_repair_stats: expr 18 % 2 2026-03-08T23:21:38.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0 2026-03-08T23:21:38.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 0 obj18 remove 2026-03-08T23:21:38.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:21:38.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:21:38.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:21:38.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:21:38.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:21:38.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj18 remove 2026-03-08T23:21:39.246 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:b9836a99:::obj18:head# 2026-03-08T23:21:39.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS) 2026-03-08T23:21:39.776 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 19 % 2
2026-03-08T23:21:39.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=1
2026-03-08T23:21:39.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 1 obj19 remove
2026-03-08T23:21:39.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:21:39.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:21:39.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:21:39.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:21:39.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:21:39.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj19 remove
2026-03-08T23:21:40.404 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:00b31b6c:::obj19:head#
2026-03-08T23:21:40.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:690: TEST_repair_stats: for i in $(seq 1 $REPAIRS)
2026-03-08T23:21:40.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: expr 20 % 2
2026-03-08T23:21:40.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:693: TEST_repair_stats: OSD=0
2026-03-08T23:21:40.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:694: TEST_repair_stats: _objectstore_tool_nodown td/osd-scrub-repair 0 obj20 remove
2026-03-08T23:21:40.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:21:40.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:21:40.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:21:40.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:21:40.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:21:40.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj20 remove
2026-03-08T23:21:41.814 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:24b5611b:::obj20:head#
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:696: TEST_repair_stats: activate_osd td/osd-scrub-repair 1 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:21:42.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T23:21:42.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:21:42.350 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T23:21:42.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1
2026-03-08T23:21:42.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T23:21:42.350 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami
2026-03-08T23:21:42.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']'
2026-03-08T23:21:42.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:21:42.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:21:42.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:21:42.366 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:42.372+0000 7f951ec448c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:21:42.367 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:42.372+0000 7f951ec448c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:21:42.369 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:42.372+0000 7f951ec448c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:21:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:21:42.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:21:43.321 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:43.328+0000 7f951ec448c0 -1 Falling back to public interface
2026-03-08T23:21:43.706 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:21:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:21:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:21:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:21:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:21:43.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:21:43.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:21:44.294 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:44.300+0000 7f951ec448c0 -1 osd.1 17 log_to_monitors true
2026-03-08T23:21:44.866 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:21:44.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:21:44.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:21:44.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:21:44.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:21:44.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:21:45.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:21:45.143 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:45.148+0000 7f9515bf4640 -1 osd.1 17 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:21:46.037 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:21:46.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:21:46.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:21:46.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:21:46.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:21:46.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 20 up_thru 20 down_at 18 last_clean_interval [10,17) [v2:127.0.0.1:6802/3484637424,v1:127.0.0.1:6803/3484637424] [v2:127.0.0.1:6804/3484637424,v1:127.0.0.1:6805/3484637424] exists,up 4aade09a-17c1-4ce9-9df5-1fbedce31c27
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:697: TEST_repair_stats: activate_osd td/osd-scrub-repair 0 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:21:46.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T23:21:46.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:21:46.204 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:21:46.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:21:46.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T23:21:46.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:21:46.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:21:46.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:21:46.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:21:46.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:21:46.221 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:46.224+0000 7ff1782808c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:21:46.221 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:46.228+0000 7ff1782808c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:21:46.222 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:46.228+0000 7ff1782808c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:21:46.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:21:46.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:21:46.669 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:46.676+0000 7ff1782808c0 -1 Falling back to public interface
2026-03-08T23:21:47.553 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:21:47.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:21:47.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:21:47.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:21:47.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:21:47.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:21:47.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:21:47.644+0000 7ff1782808c0 -1 osd.0 16 log_to_monitors true
2026-03-08T23:21:47.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:21:48.725 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:21:48.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:21:48.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:21:48.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:21:48.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:21:48.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:21:48.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:21:49.912 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:21:49.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:21:49.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:21:49.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:21:49.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:21:49.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 23 up_thru 0 down_at 17 last_clean_interval [5,16) [v2:127.0.0.1:6810/1461155101,v1:127.0.0.1:6811/1461155101] [v2:127.0.0.1:6812/1461155101,v1:127.0.0.1:6813/1461155101] exists,up 24294bc8-a770-45cb-bc16-354ac54507cd
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:698: TEST_repair_stats: wait_for_clean
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:21:50.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:21:50.077 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:21:50.077 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:21:50.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:21:50.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:21:50.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:21:50.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:21:50.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:21:50.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:21:50.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:21:50.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:21:50.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:21:50.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:21:50.308 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:21:50.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:21:50.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:21:50.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:21:50.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247810
2026-03-08T23:21:50.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247810
2026-03-08T23:21:50.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-98784247810'
2026-03-08T23:21:50.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:21:50.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:21:50.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345923
2026-03-08T23:21:50.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345923
2026-03-08T23:21:50.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-98784247810 1-85899345923'
2026-03-08T23:21:50.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:21:50.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-98784247810
2026-03-08T23:21:50.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:21:50.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:21:50.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-98784247810
2026-03-08T23:21:50.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:21:50.469 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 98784247810
2026-03-08T23:21:50.469
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247810 2026-03-08T23:21:50.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 98784247810' 2026-03-08T23:21:50.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:21:50.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 98784247810 2026-03-08T23:21:50.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:21:51.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:21:51.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:21:51.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247810 -lt 98784247810 2026-03-08T23:21:51.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:21:51.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-85899345923 2026-03-08T23:21:51.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:21:51.826 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:21:51.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-85899345923 2026-03-08T23:21:51.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:21:51.827 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 85899345923 2026-03-08T23:21:51.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345923 2026-03-08T23:21:51.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 85899345923' 2026-03-08T23:21:51.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:21:51.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345923 -lt 85899345923 2026-03-08T23:21:51.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:21:51.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:21:51.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:21:52.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 
2026-03-08T23:21:52.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:21:52.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:21:52.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:21:52.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:21:52.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:21:52.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:21:52.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:21:52.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:21:52.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:21:52.353 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:21:52.353 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:700: TEST_repair_stats: repair 1.0 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.0 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 1.0 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:21:52.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:21:52.717 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:21:13.096007+0000 2026-03-08T23:21:52.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.0 2026-03-08T23:21:52.863 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to repair 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.0 2026-03-08T23:21:13.096007+0000 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:21:13.096007+0000 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:21:52.875 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:21:52.875 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:21:52.876 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:21:53.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:21:13.096007+0000 '>' 2026-03-08T23:21:13.096007+0000 2026-03-08T23:21:53.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:21:54.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:21:54.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:21:54.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:21:54.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:21:54.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:21:54.040 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:21:54.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:21:54.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:21:13.096007+0000 '>' 2026-03-08T23:21:13.096007+0000 2026-03-08T23:21:54.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:21:55.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:21:55.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:21:55.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:21:55.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:21:55.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:21:55.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:21:55.204 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:21:55.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:21:53.255434+0000 '>' 2026-03-08T23:21:13.096007+0000 2026-03-08T23:21:55.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:21:55.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:701: TEST_repair_stats: wait_for_clean 2026-03-08T23:21:55.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:21:55.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:21:55.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:21:55.363 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:21:55.364 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:21:55.364 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:21:55.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local 
trace=true 2026-03-08T23:21:55.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:21:55.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:21:55.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:21:55.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:21:55.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:21:55.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:21:55.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:21:55.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:21:55.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:21:55.611 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:21:55.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:21:55.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:21:55.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:21:55.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247812 2026-03-08T23:21:55.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247812 2026-03-08T23:21:55.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-98784247812' 2026-03-08T23:21:55.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:21:55.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:21:55.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345925 2026-03-08T23:21:55.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345925 2026-03-08T23:21:55.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-98784247812 1-85899345925' 2026-03-08T23:21:55.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:21:55.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-98784247812 2026-03-08T23:21:55.763 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:21:55.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:21:55.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-98784247812 2026-03-08T23:21:55.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:21:55.765 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 98784247812 2026-03-08T23:21:55.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247812 2026-03-08T23:21:55.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 98784247812' 2026-03-08T23:21:55.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:21:55.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247811 -lt 98784247812 2026-03-08T23:21:55.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:21:56.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:21:56.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:21:57.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247811 -lt 98784247812 2026-03-08T23:21:57.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:21:58.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:21:58.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:21:58.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247812 -lt 98784247812 2026-03-08T23:21:58.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:21:58.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-85899345925 2026-03-08T23:21:58.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:21:58.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:21:58.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-85899345925 2026-03-08T23:21:58.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:21:58.254 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 85899345925 
2026-03-08T23:21:58.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345925 2026-03-08T23:21:58.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 85899345925' 2026-03-08T23:21:58.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:21:58.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345925 -lt 85899345925 2026-03-08T23:21:58.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:21:58.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:21:58.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:21:58.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:21:58.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:21:58.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:21:58.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:21:58.622 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:21:58.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:21:58.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:21:58.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:21:58.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:21:58.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:21:58.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:21:58.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:21:58.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:21:58.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:21:58.983 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:21:58.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:702: TEST_repair_stats: ceph pg dump pgs 2026-03-08T23:21:59.133 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T23:21:59.133 INFO:tasks.workunit.client.0.vm03.stdout:1.0 30 0 0 0 0 210 0 0 30 0 30 active+clean 2026-03-08T23:21:53.277423+0000 16'30 24:119 [1,0] 1 [1,0] 1 16'30 2026-03-08T23:21:53.255434+0000 16'30 2026-03-08T23:21:53.255434+0000 0 1 periodic scrub scheduled @ 2026-03-09T23:21:53.255434+0000 20 0 2026-03-08T23:21:59.133 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T23:21:59.133 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T23:21:59.133 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:21:59.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:703: TEST_repair_stats: flush_pg_stats 2026-03-08T23:21:59.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:21:59.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:21:59.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:21:59.304 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:21:59.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:21:59.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:21:59.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:21:59.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247814 2026-03-08T23:21:59.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247814 2026-03-08T23:21:59.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-98784247814' 2026-03-08T23:21:59.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: 
flush_pg_stats: for osd in $ids 2026-03-08T23:21:59.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:21:59.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345927 2026-03-08T23:21:59.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345927 2026-03-08T23:21:59.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-98784247814 1-85899345927' 2026-03-08T23:21:59.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:21:59.451 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-98784247814 2026-03-08T23:21:59.451 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:21:59.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:21:59.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-98784247814 2026-03-08T23:21:59.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:21:59.453 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 98784247814 2026-03-08T23:21:59.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=98784247814 2026-03-08T23:21:59.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 98784247814' 2026-03-08T23:21:59.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:21:59.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247813 -lt 98784247814 2026-03-08T23:21:59.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:22:00.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:22:00.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:22:00.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247813 -lt 98784247814 2026-03-08T23:22:00.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:22:01.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:22:01.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:22:01.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247814 -lt 98784247814 
2026-03-08T23:22:01.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:22:01.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-85899345927 2026-03-08T23:22:01.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:22:01.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:22:01.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-85899345927 2026-03-08T23:22:01.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:22:01.948 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 85899345927 2026-03-08T23:22:01.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345927 2026-03-08T23:22:01.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 85899345927' 2026-03-08T23:22:01.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:22:02.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345927 -lt 85899345927 2026-03-08T23:22:02.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:706: 
TEST_repair_stats: ceph pg 1.0 query 2026-03-08T23:22:02.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:706: TEST_repair_stats: jq .info.stats.stat_sum 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes": 210, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects": 30, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_clones": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_copies": 60, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing_on_primary": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_degraded": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_misplaced": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_unfound": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_dirty": 30, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_whiteouts": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_read": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_read_kb": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_write": 30, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_write_kb": 30, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrub_errors": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_shallow_scrub_errors": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_deep_scrub_errors": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_recovered": 
20, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_recovered": 140, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_keys_recovered": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_omap": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_hit_set_archive": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_hit_set_archive": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_kb": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_kb": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_promote": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_high": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_low": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_some": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_full": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_pinned": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_legacy_snapsets": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_large_omap_objects": 0, 2026-03-08T23:22:02.196 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_manifest": 0, 2026-03-08T23:22:02.197 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_bytes": 0, 2026-03-08T23:22:02.197 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_keys": 0, 2026-03-08T23:22:02.197 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_repaired": 20 2026-03-08T23:22:02.197 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:22:02.197 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:707: TEST_repair_stats: ceph pg 1.0 query 2026-03-08T23:22:02.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:707: TEST_repair_stats: jq .info.stats.stat_sum.num_objects_repaired 2026-03-08T23:22:02.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:707: TEST_repair_stats: COUNT=20 2026-03-08T23:22:02.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:708: TEST_repair_stats: test 20 = 20 2026-03-08T23:22:02.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:710: TEST_repair_stats: ceph pg dump --format=json-pretty 2026-03-08T23:22:02.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:710: TEST_repair_stats: jq '.pg_map.osd_stats[] | select(.osd == 1 )' 2026-03-08T23:22:02.427 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "up_from": 20, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 85899345927, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "num_pgs": 1, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "num_osds": 1, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_osds": 1, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_omap_osds": 1, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "kb": 104857600, 2026-03-08T23:22:02.439 
INFO:tasks.workunit.client.0.vm03.stdout: "kb_used": 28560, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_data": 320, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_omap": 4, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_meta": 28219, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "kb_avail": 104829040, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "statfs": { 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "total": 107374182400, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "available": 107344936960, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "internally_reserved": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "allocated": 327680, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "data_stored": 53466, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_allocated": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_original": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "omap_allocated": 4382, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "internal_metadata": 28896994 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "hb_peers": [ 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: 0 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trim_queue_len": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "num_snap_trimming": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "num_shards_repaired": 10, 2026-03-08T23:22:02.439 
INFO:tasks.workunit.client.0.vm03.stdout: "op_queue_age_hist": { 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "histogram": [], 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "upper_bound": 1 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "perf_stat": { 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ms": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ms": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ns": 0, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ns": 0 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout: "alerts": [] 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:22:02.439 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:711: TEST_repair_stats: ceph pg dump --format=json-pretty 2026-03-08T23:22:02.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:711: TEST_repair_stats: jq '.pg_map.osd_stats[] | select(.osd == 1 ).num_shards_repaired' 2026-03-08T23:22:02.591 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:22:02.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:711: TEST_repair_stats: COUNT=10 2026-03-08T23:22:02.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:712: TEST_repair_stats: expr 20 / 2 2026-03-08T23:22:02.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:712: TEST_repair_stats: test 10 = 10 
2026-03-08T23:22:02.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:714: TEST_repair_stats: ceph pg dump --format=json-pretty 2026-03-08T23:22:02.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:714: TEST_repair_stats: jq '.pg_map.osd_stats[] | select(.osd == 0 )' 2026-03-08T23:22:02.765 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "up_from": 23, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 98784247814, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "num_pgs": 1, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "num_osds": 1, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_osds": 1, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_omap_osds": 1, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "kb": 104857600, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used": 28560, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_data": 320, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_omap": 4, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_meta": 28219, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "kb_avail": 104829040, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "statfs": { 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "total": 107374182400, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "available": 107344936960, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: 
"internally_reserved": 0, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "allocated": 327680, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "data_stored": 53466, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed": 0, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_allocated": 0, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_original": 0, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "omap_allocated": 4386, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "internal_metadata": 28896990 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "hb_peers": [ 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trim_queue_len": 0, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "num_snap_trimming": 0, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "num_shards_repaired": 10, 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "op_queue_age_hist": { 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "histogram": [], 2026-03-08T23:22:02.777 INFO:tasks.workunit.client.0.vm03.stdout: "upper_bound": 1 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout: "perf_stat": { 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ms": 0, 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ms": 0, 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ns": 0, 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ns": 0 
2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout: "alerts": [] 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:715: TEST_repair_stats: ceph pg dump --format=json-pretty 2026-03-08T23:22:02.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:715: TEST_repair_stats: jq '.pg_map.osd_stats[] | select(.osd == 0 ).num_shards_repaired' 2026-03-08T23:22:02.936 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:22:02.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:715: TEST_repair_stats: COUNT=10 2026-03-08T23:22:02.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:716: TEST_repair_stats: expr 20 / 2 2026-03-08T23:22:02.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:716: TEST_repair_stats: test 10 = 10 2026-03-08T23:22:02.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:718: TEST_repair_stats: ceph pg dump --format=json-pretty 2026-03-08T23:22:02.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:718: TEST_repair_stats: jq .pg_map.osd_stats_sum 2026-03-08T23:22:03.101 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:22:03.113 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:22:03.113 INFO:tasks.workunit.client.0.vm03.stdout: "up_from": 0, 2026-03-08T23:22:03.113 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 0, 2026-03-08T23:22:03.113 
INFO:tasks.workunit.client.0.vm03.stdout: "num_pgs": 2, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "num_osds": 2, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_osds": 2, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_omap_osds": 2, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "kb": 209715200, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used": 57120, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_data": 640, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_omap": 8, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_meta": 56439, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "kb_avail": 209658080, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "statfs": { 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "total": 214748364800, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "available": 214689873920, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "internally_reserved": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "allocated": 655360, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "data_stored": 106932, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_allocated": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_original": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "omap_allocated": 8768, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "internal_metadata": 57793984 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "hb_peers": [], 
2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trim_queue_len": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "num_snap_trimming": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "num_shards_repaired": 20, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "op_queue_age_hist": { 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "histogram": [], 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "upper_bound": 1 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "perf_stat": { 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ms": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ms": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ns": 0, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ns": 0 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "alerts": [], 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout: "network_ping_times": [] 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:719: TEST_repair_stats: ceph pg dump --format=json-pretty 2026-03-08T23:22:03.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:719: TEST_repair_stats: jq .pg_map.osd_stats_sum.num_shards_repaired 2026-03-08T23:22:03.262 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:719: TEST_repair_stats: 
COUNT=20 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:720: TEST_repair_stats: test 20 = 20 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:22:03.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:22:03.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:22:03.391 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:22:03.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:22:03.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:22:03.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:22:03.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:22:03.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:22:03.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:22:03.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:22:03.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:22:03.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:22:03.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:22:03.396 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:22:03.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:22:03.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:22:03.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:22:03.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:22:03.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:22:03.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:22:03.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:22:03.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:22:03.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:22:03.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:22:03.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:22:03.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:22:03.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:22:03.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:22:03.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:22:03.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:22:03.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:22:03.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:22:03.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:22:03.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:22:03.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:22:03.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:22:03.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:22:03.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:22:03.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:22:03.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:22:03.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:22:03.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:22:03.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:22:03.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair
2026-03-08T23:22:03.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:22:03.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:22:03.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:22:03.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024
2026-03-08T23:22:03.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:22:03.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:22:03.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_repair_stats_ec td/osd-scrub-repair
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:724: TEST_repair_stats_ec: local dir=td/osd-scrub-repair
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:725: TEST_repair_stats_ec: local poolname=testpool
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:726: TEST_repair_stats_ec: local OSDS=3
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:727: TEST_repair_stats_ec: local OBJS=30
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:729: TEST_repair_stats_ec: local REPAIRS=26
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:730: TEST_repair_stats_ec: local allow_overwrites=false
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:733: TEST_repair_stats_ec: run_mon td/osd-scrub-repair a
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a
2026-03-08T23:22:03.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair
2026-03-08T23:22:03.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:22:03.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:22:03.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:22:03.450 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:22:03.450 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:22:03.450 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:22:03.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:22:03.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T23:22:03.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:22:03.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:22:03.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:22:03.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:22:03.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:22:03.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:22:03.478 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:22:03.478 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:22:03.478 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:22:03.479 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:22:03.479 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:22:03.479 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:22:03.479 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:22:03.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:22:03.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T23:22:03.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:22:03.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:22:03.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:22:03.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:22:03.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:22:03.544 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:22:03.544 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:22:03.544 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:22:03.544 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:22:03.545 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:22:03.545 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:22:03.545 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:22:03.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:22:03.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T23:22:03.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:734: TEST_repair_stats_ec: run_mgr td/osd-scrub-repair x
2026-03-08T23:22:03.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T23:22:03.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:22:03.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:22:03.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:22:03.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T23:22:03.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:22:03.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:22:03.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:22:03.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:22:03.715 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:22:03.715 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:22:03.715 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:22:03.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:22:03.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:22:03.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:22:03.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:736: TEST_repair_stats_ec: local 'ceph_osd_args=--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T23:22:03.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:737: TEST_repair_stats_ec: expr 3 - 1
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:737: TEST_repair_stats_ec: seq 0 2
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:737: TEST_repair_stats_ec: for id in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:738: TEST_repair_stats_ec: run_osd td/osd-scrub-repair 0 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:22:03.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:22:03.744 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:22:03.744 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:22:03.744 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:22:03.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:22:03.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0'
2026-03-08T23:22:03.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:22:03.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:22:03.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=81b50715-993d-4f6c-beaa-5924fc90edc4
2026-03-08T23:22:03.747 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 81b50715-993d-4f6c-beaa-5924fc90edc4
2026-03-08T23:22:03.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 81b50715-993d-4f6c-beaa-5924fc90edc4'
2026-03-08T23:22:03.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:22:03.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCbBK5pL4K+LRAAan5gpIK9990P/WqiEOrjSQ==
2026-03-08T23:22:03.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCbBK5pL4K+LRAAan5gpIK9990P/WqiEOrjSQ=="}'
2026-03-08T23:22:03.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 81b50715-993d-4f6c-beaa-5924fc90edc4 -i td/osd-scrub-repair/0/new.json
2026-03-08T23:22:03.862 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:22:03.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T23:22:03.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQCbBK5pL4K+LRAAan5gpIK9990P/WqiEOrjSQ== --osd-uuid 81b50715-993d-4f6c-beaa-5924fc90edc4
2026-03-08T23:22:03.893 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:03.896+0000 7f07667358c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:22:03.895 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:03.900+0000 7f07667358c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:22:03.897 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:03.900+0000 7f07667358c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:22:03.897 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:03.904+0000 7f07667358c0 -1 bdev(0x55e624728c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T23:22:03.897 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:03.904+0000 7f07667358c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T23:22:06.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T23:22:06.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:22:06.162 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T23:22:06.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T23:22:06.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:22:06.264 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:22:06.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T23:22:06.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:22:06.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:22:06.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:22:06.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T23:22:06.308 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:06.304+0000 7f7773f8f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:22:06.317 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:06.324+0000 7f7773f8f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:22:06.328 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:06.332+0000 7f7773f8f8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:22:06.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:22:06.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:22:06.777 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:06.784+0000 7f7773f8f8c0 -1 Falling back to public interface
2026-03-08T23:22:07.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:22:07.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:22:07.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:22:07.589 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:22:07.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:22:07.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:22:07.748 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:07.752+0000 7f7773f8f8c0 -1 osd.0 0 log_to_monitors true
2026-03-08T23:22:07.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:22:08.714 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:08.720+0000 7f776f748640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T23:22:08.765 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:22:08.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:22:08.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:22:08.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:22:08.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:22:08.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:22:08.938 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3902267615,v1:127.0.0.1:6803/3902267615] [v2:127.0.0.1:6804/3902267615,v1:127.0.0.1:6805/3902267615] exists,up 81b50715-993d-4f6c-beaa-5924fc90edc4
2026-03-08T23:22:08.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:22:08.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:737: TEST_repair_stats_ec: for id in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:738: TEST_repair_stats_ec: run_osd td/osd-scrub-repair 1 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0
2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:22:08.939
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:22:08.939 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:22:08.939 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:22:08.940 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:22:08.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T23:22:08.940 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:22:08.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:22:08.943 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 87f4651f-9b73-41ee-b7ad-4910cd30f81c 2026-03-08T23:22:08.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=87f4651f-9b73-41ee-b7ad-4910cd30f81c 2026-03-08T23:22:08.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 87f4651f-9b73-41ee-b7ad-4910cd30f81c' 2026-03-08T23:22:08.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:22:08.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCgBK5puyFSORAA1G9ScppCeNK03/g6LvkpLQ== 2026-03-08T23:22:08.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCgBK5puyFSORAA1G9ScppCeNK03/g6LvkpLQ=="}' 2026-03-08T23:22:08.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 87f4651f-9b73-41ee-b7ad-4910cd30f81c -i td/osd-scrub-repair/1/new.json 2026-03-08T23:22:09.112 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:22:09.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:22:09.126 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQCgBK5puyFSORAA1G9ScppCeNK03/g6LvkpLQ== --osd-uuid 87f4651f-9b73-41ee-b7ad-4910cd30f81c 2026-03-08T23:22:09.142 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:09.148+0000 7fdf3550a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:09.144 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:09.152+0000 7fdf3550a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:09.145 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:09.152+0000 7fdf3550a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:22:09.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:09.152+0000 7fdf3550a8c0 -1 bdev(0x556aa3f75c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:22:09.146 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:09.152+0000 7fdf3550a8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:22:11.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:22:11.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:22:11.642 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:22:11.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:22:11.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:22:11.853 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:22:11.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:22:11.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:22:11.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:22:11.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:22:11.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:22:11.871 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:11.876+0000 7efc026818c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:11.871 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:11.876+0000 7efc026818c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:11.873 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:11.876+0000 7efc026818c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:12.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:22:12.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:22:12.585 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:12.592+0000 7efc026818c0 -1 Falling back to public interface 2026-03-08T23:22:13.202 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:22:13.202 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:22:13.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:13.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:22:13.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:13.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:22:13.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:22:13.570 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:13.576+0000 7efc026818c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:22:14.376 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:22:14.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:22:14.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:14.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:22:14.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:14.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:22:14.598 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:22:15.600 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:22:15.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:22:15.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:15.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:22:15.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:15.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2202245225,v1:127.0.0.1:6811/2202245225] [v2:127.0.0.1:6812/2202245225,v1:127.0.0.1:6813/2202245225] exists,up 87f4651f-9b73-41ee-b7ad-4910cd30f81c 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:737: TEST_repair_stats_ec: for id in $(seq 0 $(expr 
$OSDS - 1)) 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:738: TEST_repair_stats_ec: run_osd td/osd-scrub-repair 2 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:22:15.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:22:15.769 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:22:15.769 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:22:15.769 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T23:22:15.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:22:15.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:22:15.771 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 bc4be6ae-c644-46f3-8e5a-bfeec0728b22 2026-03-08T23:22:15.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=bc4be6ae-c644-46f3-8e5a-bfeec0728b22 2026-03-08T23:22:15.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 bc4be6ae-c644-46f3-8e5a-bfeec0728b22' 2026-03-08T23:22:15.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:22:15.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCnBK5paYUsLxAAvWtbwDpaEMybjKkZoo/6Vw== 2026-03-08T23:22:15.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCnBK5paYUsLxAAvWtbwDpaEMybjKkZoo/6Vw=="}' 
2026-03-08T23:22:15.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new bc4be6ae-c644-46f3-8e5a-bfeec0728b22 -i td/osd-scrub-repair/2/new.json 2026-03-08T23:22:15.947 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:22:15.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T23:22:15.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 --mkfs --key AQCnBK5paYUsLxAAvWtbwDpaEMybjKkZoo/6Vw== --osd-uuid bc4be6ae-c644-46f3-8e5a-bfeec0728b22 2026-03-08T23:22:15.978 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:15.984+0000 7f5ce9a488c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:15.980 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:15.984+0000 7f5ce9a488c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:22:15.981 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:15.988+0000 7f5ce9a488c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:15.981 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:15.988+0000 7f5ce9a488c0 -1 bdev(0x55ac699efc00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:22:15.981 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:15.988+0000 7f5ce9a488c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T23:22:18.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T23:22:18.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:22:18.238 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:22:18.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:22:18.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:22:18.444 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:22:18.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:22:18.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:22:18.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:22:18.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:22:18.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:22:18.461 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:18.464+0000 7fdd5f2a58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:18.461 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:18.468+0000 7fdd5f2a58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:18.462 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:18.468+0000 7fdd5f2a58c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:22:18.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:22:19.665 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:19.672+0000 7fdd5f2a58c0 -1 Falling back to public interface 2026-03-08T23:22:19.784 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:22:19.784 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:22:19.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:19.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:22:19.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:19.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:22:19.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:22:20.645 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:20.652+0000 7fdd5f2a58c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:22:20.948 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:22:20.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:22:20.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:20.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:22:20.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:20.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:22:21.128 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:22:21.646 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:21.652+0000 7fdd5aa5e640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T23:22:22.130 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:22:22.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:22:22.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:22.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:22:22.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:22.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1755430320,v1:127.0.0.1:6819/1755430320] [v2:127.0.0.1:6820/1755430320,v1:127.0.0.1:6821/1755430320] exists,up bc4be6ae-c644-46f3-8e5a-bfeec0728b22 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:22:22.294 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:742: TEST_repair_stats_ec: create_ec_pool testpool false k=2 m=1 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=testpool 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=false 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T23:22:22.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=1 2026-03-08T23:22:22.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool testpool 1 1 erasure myprofile 2026-03-08T23:22:22.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create testpool 1 1 erasure myprofile 2026-03-08T23:22:22.872 INFO:tasks.workunit.client.0.vm03.stderr:pool 'testpool' created 2026-03-08T23:22:22.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:22:23.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' false = true ']' 2026-03-08T23:22:23.890 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T23:22:23.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:22:23.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:22:23.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:22:23.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:22:23.890 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:22:23.890 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:22:23.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:22:23.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:22:23.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:22:23.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:22:23.954 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:22:23.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:22:23.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:22:23.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:22:23.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:22:24.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:22:24.114 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:22:24.114 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:22:24.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:22:24.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:22:24.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:22:24.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T23:22:24.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 
2026-03-08T23:22:24.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:22:24.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:22:24.190 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:22:24.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672963 2026-03-08T23:22:24.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672963 2026-03-08T23:22:24.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963' 2026-03-08T23:22:24.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:22:24.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:22:24.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509442 2026-03-08T23:22:24.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509442 2026-03-08T23:22:24.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963 2-64424509442' 2026-03-08T23:22:24.339 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:22:24.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:22:24.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:22:24.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:22:24.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:22:24.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:22:24.343 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:22:24.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:22:24.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:22:24.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:22:24.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485 2026-03-08T23:22:24.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:22:24.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672963 2026-03-08T23:22:24.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:22:24.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:22:24.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672963 2026-03-08T23:22:24.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:22:24.513 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963 2026-03-08T23:22:24.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672963 2026-03-08T23:22:24.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672963' 2026-03-08T23:22:24.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:22:24.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672963 -lt 42949672963 2026-03-08T23:22:24.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:22:24.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
2-64424509442 2026-03-08T23:22:24.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:22:24.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:22:24.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509442 2026-03-08T23:22:24.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:22:24.685 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442 2026-03-08T23:22:24.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509442 2026-03-08T23:22:24.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509442' 2026-03-08T23:22:24.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:22:24.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509442 2026-03-08T23:22:24.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:22:24.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:22:24.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: 
get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:22:25.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:22:25.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:22:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:22:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:22:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:22:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:22:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:22:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:22:25.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:22:25.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:22:25.219 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:22:25.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:22:25.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:22:25.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:22:25.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:22:25.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:22:25.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:745: TEST_repair_stats_ec: local payload=ABCDEF 2026-03-08T23:22:25.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:746: TEST_repair_stats_ec: echo ABCDEF 2026-03-08T23:22:25.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: seq 1 30 2026-03-08T23:22:25.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS) 2026-03-08T23:22:25.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj1 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:22:25.428 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS) 2026-03-08T23:22:25.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj2 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:22:25.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS) 2026-03-08T23:22:25.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj3 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:22:25.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS) 2026-03-08T23:22:25.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj4 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:22:25.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS) 2026-03-08T23:22:25.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj5 td/osd-scrub-repair/ORIGINAL 2026-03-08T23:22:25.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS) 2026-03-08T23:22:25.517 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj6 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj7 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj8 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj9 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj10 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj11 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj12 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj13 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj14 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj15 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj16 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj17 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj18 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj19 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj20 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj21 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj22 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj23 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj24 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj25 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj26 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:25.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:25.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj27 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:26.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:26.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj28 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:26.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:26.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj29 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:26.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:747: TEST_repair_stats_ec: for i in $(seq 1 $OBJS)
2026-03-08T23:22:26.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:749: TEST_repair_stats_ec: rados --pool testpool put obj30 td/osd-scrub-repair/ORIGINAL
2026-03-08T23:22:26.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:754: TEST_repair_stats_ec: get_not_primary testpool obj1
2026-03-08T23:22:26.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=testpool
2026-03-08T23:22:26.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=obj1
2026-03-08T23:22:26.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary testpool obj1
2026-03-08T23:22:26.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T23:22:26.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1
2026-03-08T23:22:26.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool obj1
2026-03-08T23:22:26.070 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T23:22:26.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1
2026-03-08T23:22:26.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map testpool obj1
2026-03-08T23:22:26.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]'
2026-03-08T23:22:26.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:754: TEST_repair_stats_ec: local other=0
2026-03-08T23:22:26.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:755: TEST_repair_stats_ec: get_pg testpool obj1
2026-03-08T23:22:26.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=testpool
2026-03-08T23:22:26.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=obj1
2026-03-08T23:22:26.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map testpool obj1
2026-03-08T23:22:26.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T23:22:26.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:755: TEST_repair_stats_ec: local pgid=1.0
2026-03-08T23:22:26.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:756: TEST_repair_stats_ec: get_primary testpool obj1
2026-03-08T23:22:26.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=testpool
2026-03-08T23:22:26.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1
2026-03-08T23:22:26.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map testpool obj1
2026-03-08T23:22:26.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T23:22:26.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:756: TEST_repair_stats_ec: local primary=1
2026-03-08T23:22:26.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:758: TEST_repair_stats_ec: kill_daemons td/osd-scrub-repair TERM osd.0
2026-03-08T23:22:26.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:22:26.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:22:26.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:22:26.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:22:26.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:22:26.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:22:26.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:759: TEST_repair_stats_ec: kill_daemons td/osd-scrub-repair TERM osd.1
2026-03-08T23:22:26.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:22:26.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:22:26.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:22:26.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:22:26.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:22:26.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:22:26.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: seq 1 26
2026-03-08T23:22:26.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:26.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 1 % 2
2026-03-08T23:22:26.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1
2026-03-08T23:22:26.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj1 remove
2026-03-08T23:22:26.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:26.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:26.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:22:26.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:26.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:22:26.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj1 remove
2026-03-08T23:22:27.629 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:ff7b1f36:::obj1:head#
2026-03-08T23:22:28.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:28.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 2 % 2
2026-03-08T23:22:28.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0
2026-03-08T23:22:28.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj2 remove
2026-03-08T23:22:28.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:28.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:28.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:22:28.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:28.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:22:28.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj2 remove
2026-03-08T23:22:28.826 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:104778fc:::obj2:head#
2026-03-08T23:22:29.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:29.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 3 % 2
2026-03-08T23:22:29.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1
2026-03-08T23:22:29.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj3 remove
2026-03-08T23:22:29.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:29.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:29.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:22:29.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:29.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:22:29.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj3 remove
2026-03-08T23:22:29.992 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:8dd16f86:::obj3:head#
2026-03-08T23:22:30.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:30.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 4 % 2
2026-03-08T23:22:30.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0
2026-03-08T23:22:30.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj4 remove
2026-03-08T23:22:30.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:30.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:30.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:22:30.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:30.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:22:30.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj4 remove
2026-03-08T23:22:31.140 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:0ee9ae15:::obj4:head#
2026-03-08T23:22:31.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:31.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 5 % 2
2026-03-08T23:22:31.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1
2026-03-08T23:22:31.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj5 remove
2026-03-08T23:22:31.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:31.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:31.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:22:31.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:31.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:22:31.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj5 remove
2026-03-08T23:22:32.313 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:c52c9666:::obj5:head#
2026-03-08T23:22:32.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:32.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 6 % 2
2026-03-08T23:22:32.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0
2026-03-08T23:22:32.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj6 remove
2026-03-08T23:22:32.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:32.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:32.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:22:32.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:32.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:22:32.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj6 remove
2026-03-08T23:22:33.479 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:233a42c1:::obj6:head#
2026-03-08T23:22:34.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:34.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 7 % 2
2026-03-08T23:22:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1
2026-03-08T23:22:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj7 remove
2026-03-08T23:22:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:22:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:22:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj7 remove
2026-03-08T23:22:34.640 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:9a09113a:::obj7:head#
2026-03-08T23:22:35.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:35.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 8 % 2
2026-03-08T23:22:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0
2026-03-08T23:22:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj8 remove
2026-03-08T23:22:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:22:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:22:35.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj8 remove
2026-03-08T23:22:35.825 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:a557efb1:::obj8:head#
2026-03-08T23:22:36.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:36.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 9 % 2
2026-03-08T23:22:36.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1
2026-03-08T23:22:36.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj9 remove
2026-03-08T23:22:36.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:36.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:36.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:22:36.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:36.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:22:36.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj9 remove
2026-03-08T23:22:36.993 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:fe351e27:::obj9:head#
2026-03-08T23:22:37.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:37.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 10 % 2
2026-03-08T23:22:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0
2026-03-08T23:22:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj10 remove
2026-03-08T23:22:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:22:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:22:37.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj10 remove
2026-03-08T23:22:38.161 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:d1337354:::obj10:head#
2026-03-08T23:22:38.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:38.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 11 % 2
2026-03-08T23:22:38.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1
2026-03-08T23:22:38.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj11 remove
2026-03-08T23:22:38.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:38.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:38.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1
2026-03-08T23:22:38.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:38.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:22:38.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj11 remove
2026-03-08T23:22:39.315 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:eeae2a94:::obj11:head#
2026-03-08T23:22:39.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS)
2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 12 % 2
2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0
2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj12 remove
2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair
2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0
2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261:
_objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:22:39.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj12 remove 2026-03-08T23:22:40.477 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:af5f95cb:::obj12:head# 2026-03-08T23:22:41.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:41.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 13 % 2 2026-03-08T23:22:41.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1 2026-03-08T23:22:41.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj13 remove 2026-03-08T23:22:41.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:41.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:41.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:22:41.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:41.012 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:41.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj13 remove 2026-03-08T23:22:41.627 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:a61ad63e:::obj13:head# 2026-03-08T23:22:42.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:42.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 14 % 2 2026-03-08T23:22:42.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0 2026-03-08T23:22:42.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj14 remove 2026-03-08T23:22:42.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:42.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:42.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:22:42.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 
2026-03-08T23:22:42.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:22:42.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj14 remove 2026-03-08T23:22:42.788 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:e0a44829:::obj14:head# 2026-03-08T23:22:43.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:43.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 15 % 2 2026-03-08T23:22:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1 2026-03-08T23:22:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj15 remove 2026-03-08T23:22:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:22:43.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: 
_objectstore_tool_nodown: shift 2026-03-08T23:22:43.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:43.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj15 remove 2026-03-08T23:22:44.187 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:ab946124:::obj15:head# 2026-03-08T23:22:44.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:44.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 16 % 2 2026-03-08T23:22:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0 2026-03-08T23:22:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj16 remove 2026-03-08T23:22:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:22:44.721 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:22:44.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj16 remove 2026-03-08T23:22:45.347 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:461f8b5e:::obj16:head# 2026-03-08T23:22:45.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:45.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 17 % 2 2026-03-08T23:22:45.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1 2026-03-08T23:22:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj17 remove 2026-03-08T23:22:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 
2026-03-08T23:22:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:45.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj17 remove 2026-03-08T23:22:46.504 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:24fc6e92:::obj17:head# 2026-03-08T23:22:47.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:47.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 18 % 2 2026-03-08T23:22:47.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0 2026-03-08T23:22:47.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj18 remove 2026-03-08T23:22:47.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:47.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:47.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: 
_objectstore_tool_nodown: local id=0 2026-03-08T23:22:47.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:47.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:22:47.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj18 remove 2026-03-08T23:22:47.664 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:b9836a99:::obj18:head# 2026-03-08T23:22:48.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:48.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 19 % 2 2026-03-08T23:22:48.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1 2026-03-08T23:22:48.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj19 remove 2026-03-08T23:22:48.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:48.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:48.204 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:22:48.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:48.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:48.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj19 remove 2026-03-08T23:22:48.839 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:00b31b6c:::obj19:head# 2026-03-08T23:22:49.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:49.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 20 % 2 2026-03-08T23:22:49.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0 2026-03-08T23:22:49.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj20 remove 2026-03-08T23:22:49.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:49.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 
2026-03-08T23:22:49.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:22:49.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:49.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:22:49.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj20 remove 2026-03-08T23:22:50.006 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:24b5611b:::obj20:head# 2026-03-08T23:22:50.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:50.540 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 21 % 2 2026-03-08T23:22:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1 2026-03-08T23:22:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj21 remove 2026-03-08T23:22:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: 
_objectstore_tool_nodown: shift 2026-03-08T23:22:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:22:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj21 remove 2026-03-08T23:22:51.172 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:1243fe1a:::obj21:head# 2026-03-08T23:22:51.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:51.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 22 % 2 2026-03-08T23:22:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0 2026-03-08T23:22:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj22 remove 2026-03-08T23:22:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:51.708 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:22:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:22:51.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj22 remove 2026-03-08T23:22:52.329 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:c0c49def:::obj22:head# 2026-03-08T23:22:52.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:52.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 23 % 2 2026-03-08T23:22:52.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1 2026-03-08T23:22:52.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj23 remove 2026-03-08T23:22:52.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 
2026-03-08T23:22:52.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:52.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:22:52.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:52.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:52.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj23 remove 2026-03-08T23:22:53.492 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:242c40e0:::obj23:head# 2026-03-08T23:22:54.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:54.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 24 % 2 2026-03-08T23:22:54.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0 2026-03-08T23:22:54.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj24 remove 2026-03-08T23:22:54.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local 
dir=td/osd-scrub-repair 2026-03-08T23:22:54.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:54.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:22:54.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:54.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:22:54.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj24 remove 2026-03-08T23:22:54.665 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:7eacede5:::obj24:head# 2026-03-08T23:22:55.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:55.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 25 % 2 2026-03-08T23:22:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=1 2026-03-08T23:22:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 1 obj25 remove 2026-03-08T23:22:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: 
_objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:22:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/1 obj25 remove 2026-03-08T23:22:55.825 INFO:tasks.workunit.client.0.vm03.stdout:remove 0#1:7109478f:::obj25:head# 2026-03-08T23:22:56.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:760: TEST_repair_stats_ec: for i in $(seq 1 $REPAIRS) 2026-03-08T23:22:56.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: expr 26 % 2 2026-03-08T23:22:56.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:763: TEST_repair_stats_ec: OSD=0 2026-03-08T23:22:56.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:764: TEST_repair_stats_ec: _objectstore_tool_nodown td/osd-scrub-repair 0 obj26 remove 2026-03-08T23:22:56.361 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:22:56.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:22:56.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:22:56.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:22:56.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:22:56.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 obj26 remove 2026-03-08T23:22:57.013 INFO:tasks.workunit.client.0.vm03.stdout:remove 1#1:f3c2593a:::obj26:head# 2026-03-08T23:22:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:766: TEST_repair_stats_ec: activate_osd td/osd-scrub-repair 1 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:22:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:22:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:22:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 
2026-03-08T23:22:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:22:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:22:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:22:57.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:22:57.548 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:22:57.548 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 
2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T23:22:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:22:57.550 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:22:57.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:22:57.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: 
activate_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:22:57.550 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/1/whoami 2026-03-08T23:22:57.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:22:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:22:57.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:22:57.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:22:57.568 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:57.572+0000 7fbf9fbb98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:57.568 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:57.576+0000 7fbf9fbb98c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:22:57.570 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:57.576+0000 7fbf9fbb98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:22:57.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:57.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:22:57.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:22:58.017 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:58.024+0000 7fbf9fbb98c0 -1 Falling back to 
public interface 2026-03-08T23:22:58.886 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:22:58.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:22:58.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:22:58.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:22:58.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:22:58.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:22:58.981 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:58.988+0000 7fbf9fbb98c0 -1 osd.1 21 log_to_monitors true 2026-03-08T23:22:59.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:22:59.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:22:59.844+0000 7fbf96b69640 -1 osd.1 21 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:23:00.059 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:23:00.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:23:00.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:00.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:23:00.059 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:00.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:23:00.217 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 25 up_thru 25 down_at 22 last_clean_interval [10,21) [v2:127.0.0.1:6802/1739643096,v1:127.0.0.1:6803/1739643096] [v2:127.0.0.1:6804/1739643096,v1:127.0.0.1:6805/1739643096] exists,up 87f4651f-9b73-41ee-b7ad-4910cd30f81c 2026-03-08T23:23:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:23:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:23:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:23:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:767: TEST_repair_stats_ec: activate_osd td/osd-scrub-repair 0 --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:23:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:23:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:23:00.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:23:00.217 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:23:00.218 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:00.218 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 
2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0' 2026-03-08T23:23:00.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:23:00.220 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:23:00.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:23:00.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: 
activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd-scrub-interval-randomize-ratio=0 2026-03-08T23:23:00.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:23:00.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:23:00.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:23:00.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:23:00.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:23:00.238 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:00.244+0000 7fe25b7cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:23:00.238 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:00.244+0000 7fe25b7cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:23:00.239 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:00.244+0000 7fe25b7cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:00.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:23:00.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:23:00.957 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:00.964+0000 7fe25b7cf8c0 -1 Falling back to 
public interface 2026-03-08T23:23:01.571 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:23:01.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:23:01.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:01.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:23:01.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:01.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:23:01.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:23:02.191 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:02.196+0000 7fe25b7cf8c0 -1 osd.0 20 log_to_monitors true 2026-03-08T23:23:02.736 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:23:02.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:23:02.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:02.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:23:02.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:02.736 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:23:02.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:23:03.149 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:03.156+0000 7fe25277f640 -1 osd.0 20 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:23:03.930 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:23:03.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:23:03.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:03.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:23:03.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:03.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:23:04.099 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 28 up_thru 0 down_at 21 last_clean_interval [5,20) [v2:127.0.0.1:6810/3257448117,v1:127.0.0.1:6811/3257448117] [v2:127.0.0.1:6812/3257448117,v1:127.0.0.1:6813/3257448117] exists,up 81b50715-993d-4f6c-beaa-5924fc90edc4 2026-03-08T23:23:04.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:23:04.099 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:23:04.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:23:04.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:768: TEST_repair_stats_ec: wait_for_clean 2026-03-08T23:23:04.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:23:04.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:23:04.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:23:04.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:23:04.100 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:23:04.100 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:23:04.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:23:04.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:23:04.100 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:23:04.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:23:04.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:23:04.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:23:04.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:23:04.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:23:04.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:23:04.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:23:04.336 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:23:04.336 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:23:04.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:23:04.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:23:04.336 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:23:04.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084290 2026-03-08T23:23:04.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084290 2026-03-08T23:23:04.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084290' 2026-03-08T23:23:04.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:23:04.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:23:04.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182403 2026-03-08T23:23:04.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182403 2026-03-08T23:23:04.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084290 1-107374182403' 2026-03-08T23:23:04.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:23:04.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:23:04.567 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509451 2026-03-08T23:23:04.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509451 2026-03-08T23:23:04.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084290 1-107374182403 2-64424509451' 2026-03-08T23:23:04.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:23:04.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-120259084290 2026-03-08T23:23:04.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:23:04.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:23:04.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-120259084290 2026-03-08T23:23:04.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:23:04.570 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 120259084290 2026-03-08T23:23:04.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084290 2026-03-08T23:23:04.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 
120259084290' 2026-03-08T23:23:04.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:23:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 120259084290 2026-03-08T23:23:04.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:23:05.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:23:05.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:23:05.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 120259084290 2026-03-08T23:23:05.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:23:06.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:23:06.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:23:07.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084290 -lt 120259084290 2026-03-08T23:23:07.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:23:07.079 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-107374182403 2026-03-08T23:23:07.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:23:07.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:23:07.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-107374182403 2026-03-08T23:23:07.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:23:07.082 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 107374182403 2026-03-08T23:23:07.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182403 2026-03-08T23:23:07.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 107374182403' 2026-03-08T23:23:07.082 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:23:07.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182403 -lt 107374182403 2026-03-08T23:23:07.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:23:07.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509451 
2026-03-08T23:23:07.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:23:07.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:23:07.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509451 2026-03-08T23:23:07.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:23:07.250 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509451 2026-03-08T23:23:07.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509451 2026-03-08T23:23:07.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509451' 2026-03-08T23:23:07.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:23:07.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509451 -lt 64424509451 2026-03-08T23:23:07.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:23:07.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:23:07.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq 
.pgmap.num_pgs 2026-03-08T23:23:07.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:23:07.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:23:07.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:23:07.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:23:07.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:23:07.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:23:07.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:23:07.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:23:07.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:23:07.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:23:07.779 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:23:07.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:23:07.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:23:07.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:23:07.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:23:07.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:770: TEST_repair_stats_ec: repair 1.0 2026-03-08T23:23:07.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.0 2026-03-08T23:23:07.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 1.0 2026-03-08T23:23:07.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:07.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:07.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:07.972 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:08.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:08.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.0 2026-03-08T23:23:08.276 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0s0 on osd.1 to repair 2026-03-08T23:23:08.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.0 2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:08.289 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:08.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:08.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:22:22.878765+0000 '>' 2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:08.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:09.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:23:09.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:09.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:09.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:09.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:09.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:09.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:09.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:22:22.878765+0000 '>' 2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:09.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:23:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:10.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:10.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:10.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:10.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:10.620 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:10.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:22:22.878765+0000 '>' 2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:10.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:11.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:23:11.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:11.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:11.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:11.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:11.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:11.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:11.946 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:22:22.878765+0000 '>' 2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:11.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:12.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:23:12.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:12.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:12.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:12.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:12.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:12.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:13.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:22:22.878765+0000 '>' 2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:13.109 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:23:14.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:14.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:14.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:14.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:14.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:14.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:14.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:22:22.878765+0000 '>' 2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:14.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:15.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:23:15.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:15.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:15.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:15.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:15.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:15.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:09.106403+0000 '>' 2026-03-08T23:22:22.878765+0000 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:771: TEST_repair_stats_ec: wait_for_clean 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:23:15.423 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:23:15.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:23:15.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:23:15.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:23:15.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:23:15.485 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:23:15.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:23:15.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:23:15.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:23:15.646 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:23:15.646 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:23:15.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:23:15.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:23:15.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:23:15.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084293 2026-03-08T23:23:15.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084293 2026-03-08T23:23:15.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084293' 2026-03-08T23:23:15.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:23:15.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:23:15.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182406 2026-03-08T23:23:15.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182406 2026-03-08T23:23:15.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084293 1-107374182406' 2026-03-08T23:23:15.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:23:15.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:23:15.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509454 2026-03-08T23:23:15.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509454 2026-03-08T23:23:15.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084293 1-107374182406 2-64424509454' 2026-03-08T23:23:15.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:23:15.901 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-120259084293 2026-03-08T23:23:15.901 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:23:15.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:23:15.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-120259084293 2026-03-08T23:23:15.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:23:15.905 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 120259084293 2026-03-08T23:23:15.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084293 2026-03-08T23:23:15.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 120259084293' 2026-03-08T23:23:15.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:23:16.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084292 -lt 120259084293 2026-03-08T23:23:16.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:23:17.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:23:17.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:23:17.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084293 -lt 120259084293
2026-03-08T23:23:17.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:23:17.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-107374182406
2026-03-08T23:23:17.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:23:17.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:23:17.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-107374182406
2026-03-08T23:23:17.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:23:17.250 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 107374182406
2026-03-08T23:23:17.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182406
2026-03-08T23:23:17.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 107374182406'
2026-03-08T23:23:17.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:23:17.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182406 -lt 107374182406
2026-03-08T23:23:17.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:23:17.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509454
2026-03-08T23:23:17.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:23:17.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:23:17.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509454
2026-03-08T23:23:17.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:23:17.425 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509454
2026-03-08T23:23:17.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509454
2026-03-08T23:23:17.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509454'
2026-03-08T23:23:17.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:23:17.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509454 -lt 64424509454
2026-03-08T23:23:17.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:23:17.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:23:17.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:23:17.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T23:23:17.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:23:17.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:23:17.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:23:17.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:23:17.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:23:17.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:23:17.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:23:17.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T23:23:17.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:23:17.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:23:17.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:23:18.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T23:23:18.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:23:18.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:23:18.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:772: TEST_repair_stats_ec: ceph pg dump pgs
2026-03-08T23:23:18.287 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-08T23:23:18.287 INFO:tasks.workunit.client.0.vm03.stdout:1.0 30 0 0 0 0 210 0 0 30 0 30 active+clean 2026-03-08T23:23:09.190809+0000 20'30 29:131 [1,0,2] 1 [1,0,2] 1 20'30 2026-03-08T23:23:09.106403+0000 20'30 2026-03-08T23:23:09.106403+0000 0 1 periodic scrub scheduled @ 2026-03-09T23:23:09.106403+0000 17 0
2026-03-08T23:23:18.287 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-08T23:23:18.287 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-08T23:23:18.287 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:23:18.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:773: TEST_repair_stats_ec: flush_pg_stats
2026-03-08T23:23:18.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:23:18.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:23:18.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:23:18.457 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:23:18.457 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:23:18.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:23:18.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:23:18.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:23:18.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084295
2026-03-08T23:23:18.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084295
2026-03-08T23:23:18.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084295'
2026-03-08T23:23:18.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:23:18.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:23:18.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182407
2026-03-08T23:23:18.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182407
2026-03-08T23:23:18.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084295 1-107374182407'
2026-03-08T23:23:18.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:23:18.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:23:18.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509456
2026-03-08T23:23:18.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509456
2026-03-08T23:23:18.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-120259084295 1-107374182407 2-64424509456'
2026-03-08T23:23:18.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:23:18.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-120259084295
2026-03-08T23:23:18.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:23:18.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:23:18.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-120259084295
2026-03-08T23:23:18.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:23:18.686 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 120259084295
2026-03-08T23:23:18.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084295
2026-03-08T23:23:18.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 120259084295'
2026-03-08T23:23:18.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:23:18.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084294 -lt 120259084295
2026-03-08T23:23:18.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:23:19.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:23:19.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:23:20.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084294 -lt 120259084295
2026-03-08T23:23:20.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:23:21.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:23:21.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:23:21.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084295 -lt 120259084295
2026-03-08T23:23:21.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:23:21.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-107374182407
2026-03-08T23:23:21.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:23:21.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:23:21.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-107374182407
2026-03-08T23:23:21.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:23:21.172 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 107374182407
2026-03-08T23:23:21.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182407
2026-03-08T23:23:21.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 107374182407'
2026-03-08T23:23:21.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:23:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182408 -lt 107374182407
2026-03-08T23:23:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:23:21.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:23:21.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509456
2026-03-08T23:23:21.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:23:21.332 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509456
2026-03-08T23:23:21.332 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:23:21.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509456
2026-03-08T23:23:21.333 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509456
2026-03-08T23:23:21.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509456'
2026-03-08T23:23:21.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:23:21.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509456 -lt 64424509456
2026-03-08T23:23:21.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:776: TEST_repair_stats_ec: ceph pg 1.0 query
2026-03-08T23:23:21.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:776: TEST_repair_stats_ec: jq .info.stats.stat_sum
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes": 210,
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects": 30,
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_clones": 0,
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout: "num_object_copies": 90,
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing_on_primary": 0,
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_missing": 0,
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_degraded": 0,
2026-03-08T23:23:21.574 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_misplaced": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_unfound": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_dirty": 30,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_whiteouts": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_read": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_read_kb": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_write": 30,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_write_kb": 30,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrub_errors": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_shallow_scrub_errors": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_deep_scrub_errors": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_recovered": 26,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_recovered": 182,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_keys_recovered": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_omap": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_hit_set_archive": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_bytes_hit_set_archive": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_kb": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_kb": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_promote": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_high": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_flush_mode_low": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_some": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_evict_mode_full": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_pinned": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_legacy_snapsets": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_large_omap_objects": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_manifest": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_bytes": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_omap_keys": 0,
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout: "num_objects_repaired": 26
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:777: TEST_repair_stats_ec: ceph pg 1.0 query
2026-03-08T23:23:21.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:777: TEST_repair_stats_ec: jq .info.stats.stat_sum.num_objects_repaired
2026-03-08T23:23:21.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:777: TEST_repair_stats_ec: COUNT=26
2026-03-08T23:23:21.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:778: TEST_repair_stats_ec: test 26 = 26
2026-03-08T23:23:21.658 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:780: TEST_repair_stats_ec: expr 3 - 1
2026-03-08T23:23:21.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:780: TEST_repair_stats_ec: seq 0 2
2026-03-08T23:23:21.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:780: TEST_repair_stats_ec: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:23:21.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:781: TEST_repair_stats_ec: '[' 0 = 0 -o 0 = 1 ']'
2026-03-08T23:23:21.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:782: TEST_repair_stats_ec: expr 26 / 2
2026-03-08T23:23:21.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:782: TEST_repair_stats_ec: repair=13
2026-03-08T23:23:21.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:787: TEST_repair_stats_ec: ceph pg dump --format=json-pretty
2026-03-08T23:23:21.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:787: TEST_repair_stats_ec: jq '.pg_map.osd_stats[] | select(.osd == 0 )'
2026-03-08T23:23:21.822 INFO:tasks.workunit.client.0.vm03.stderr:dumped all
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "up_from": 28,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 120259084295,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "num_pgs": 1,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "num_osds": 1,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_osds": 1,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_omap_osds": 1,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "kb": 104857600,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used": 28472,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_data": 360,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_omap": 4,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_meta": 28091,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "kb_avail": 104829128,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "statfs": {
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "total": 107374182400,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "available": 107345027072,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "internally_reserved": 0,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "allocated": 368640,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "data_stored": 200261,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed": 0,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_allocated": 0,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_original": 0,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "omap_allocated": 4388,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "internal_metadata": 28765916
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "hb_peers": [
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: 1,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: 2
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trim_queue_len": 0,
2026-03-08T23:23:21.836 INFO:tasks.workunit.client.0.vm03.stdout: "num_snap_trimming": 0,
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "num_shards_repaired": 13,
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "op_queue_age_hist": {
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "histogram": [],
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "upper_bound": 1
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "perf_stat": {
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ms": 0,
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ms": 0,
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ns": 0,
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ns": 0
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout: "alerts": []
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: ceph pg dump --format=json-pretty
2026-03-08T23:23:21.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: jq '.pg_map.osd_stats[] | select(.osd == 0 ).num_shards_repaired'
2026-03-08T23:23:21.998 INFO:tasks.workunit.client.0.vm03.stderr:dumped all
2026-03-08T23:23:22.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: COUNT=13
2026-03-08T23:23:22.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:789: TEST_repair_stats_ec: test 13 = 13
2026-03-08T23:23:22.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:780: TEST_repair_stats_ec: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:23:22.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:781: TEST_repair_stats_ec: '[' 1 = 0 -o 1 = 1 ']'
2026-03-08T23:23:22.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:782: TEST_repair_stats_ec: expr 26 / 2
2026-03-08T23:23:22.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:782: TEST_repair_stats_ec: repair=13
2026-03-08T23:23:22.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:787: TEST_repair_stats_ec: ceph pg dump --format=json-pretty
2026-03-08T23:23:22.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:787: TEST_repair_stats_ec: jq '.pg_map.osd_stats[] | select(.osd == 1 )'
2026-03-08T23:23:22.172 INFO:tasks.workunit.client.0.vm03.stderr:dumped all
2026-03-08T23:23:22.183 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:22.183 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:23:22.183 INFO:tasks.workunit.client.0.vm03.stdout: "up_from": 25,
2026-03-08T23:23:22.183 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 107374182408,
2026-03-08T23:23:22.183 INFO:tasks.workunit.client.0.vm03.stdout: "num_pgs": 1,
2026-03-08T23:23:22.183 INFO:tasks.workunit.client.0.vm03.stdout: "num_osds": 1,
2026-03-08T23:23:22.183 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_osds": 1,
2026-03-08T23:23:22.183 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_omap_osds": 1,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "kb": 104857600,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used": 28472,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_data": 360,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_omap": 4,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_meta": 28091,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "kb_avail": 104829128,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "statfs": {
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "total": 107374182400,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "available": 107345027072,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "internally_reserved": 0,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "allocated": 368640,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "data_stored": 200261,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed": 0,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_allocated": 0,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_original": 0,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "omap_allocated": 4383,
2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "internal_metadata": 28765921 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "hb_peers": [ 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: 2 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trim_queue_len": 0, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "num_snap_trimming": 0, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "num_shards_repaired": 13, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "op_queue_age_hist": { 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "histogram": [], 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "upper_bound": 1 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "perf_stat": { 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ms": 0, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ms": 0, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ns": 0, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ns": 0 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout: "alerts": [] 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:23:22.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: ceph pg dump --format=json-pretty 2026-03-08T23:23:22.184 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: jq '.pg_map.osd_stats[] | select(.osd == 1 ).num_shards_repaired' 2026-03-08T23:23:22.333 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:23:22.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: COUNT=13 2026-03-08T23:23:22.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:789: TEST_repair_stats_ec: test 13 = 13 2026-03-08T23:23:22.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:780: TEST_repair_stats_ec: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:23:22.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:781: TEST_repair_stats_ec: '[' 2 = 0 -o 2 = 1 ']' 2026-03-08T23:23:22.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:784: TEST_repair_stats_ec: repair=0 2026-03-08T23:23:22.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:787: TEST_repair_stats_ec: ceph pg dump --format=json-pretty 2026-03-08T23:23:22.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:787: TEST_repair_stats_ec: jq '.pg_map.osd_stats[] | select(.osd == 2 )' 2026-03-08T23:23:22.496 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 2, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "up_from": 15, 2026-03-08T23:23:22.508 
INFO:tasks.workunit.client.0.vm03.stdout: "seq": 64424509457, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "num_pgs": 1, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "num_osds": 1, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_osds": 1, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_omap_osds": 1, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "kb": 104857600, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used": 27200, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_data": 360, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_omap": 1, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_meta": 26814, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "kb_avail": 104830400, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "statfs": { 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "total": 107374182400, 2026-03-08T23:23:22.508 INFO:tasks.workunit.client.0.vm03.stdout: "available": 107346329600, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "internally_reserved": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "allocated": 368640, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "data_stored": 200261, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_allocated": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_original": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "omap_allocated": 1590, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "internal_metadata": 27457994 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: }, 
2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "hb_peers": [ 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trim_queue_len": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "num_snap_trimming": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "num_shards_repaired": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "op_queue_age_hist": { 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "histogram": [], 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "upper_bound": 1 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "perf_stat": { 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ms": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ms": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ns": 0, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ns": 0 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout: "alerts": [] 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: ceph pg dump --format=json-pretty 2026-03-08T23:23:22.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: jq '.pg_map.osd_stats[] | select(.osd == 2 ).num_shards_repaired' 
2026-03-08T23:23:22.670 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:23:22.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:788: TEST_repair_stats_ec: COUNT=0 2026-03-08T23:23:22.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:789: TEST_repair_stats_ec: test 0 = 0 2026-03-08T23:23:22.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:792: TEST_repair_stats_ec: ceph pg dump --format=json-pretty 2026-03-08T23:23:22.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:792: TEST_repair_stats_ec: jq .pg_map.osd_stats_sum 2026-03-08T23:23:22.829 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:23:22.841 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "up_from": 0, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 0, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "num_pgs": 3, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "num_osds": 3, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_osds": 3, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "num_per_pool_omap_osds": 3, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "kb": 314572800, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used": 84144, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_data": 1080, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_omap": 10, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "kb_used_meta": 82997, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "kb_avail": 314488656, 
2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "statfs": { 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "total": 322122547200, 2026-03-08T23:23:22.860 INFO:tasks.workunit.client.0.vm03.stdout: "available": 322036383744, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "internally_reserved": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "allocated": 1105920, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "data_stored": 600783, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_allocated": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "data_compressed_original": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "omap_allocated": 10361, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "internal_metadata": 84989831 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "hb_peers": [], 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "snap_trim_queue_len": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "num_snap_trimming": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "num_shards_repaired": 26, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "op_queue_age_hist": { 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "histogram": [], 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "upper_bound": 1 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "perf_stat": { 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ms": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: 
"apply_latency_ms": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "commit_latency_ns": 0, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "apply_latency_ns": 0 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "alerts": [], 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout: "network_ping_times": [] 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:793: TEST_repair_stats_ec: ceph pg dump --format=json-pretty 2026-03-08T23:23:22.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:793: TEST_repair_stats_ec: jq .pg_map.osd_stats_sum.num_shards_repaired 2026-03-08T23:23:22.994 INFO:tasks.workunit.client.0.vm03.stderr:dumped all 2026-03-08T23:23:23.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:793: TEST_repair_stats_ec: COUNT=26 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:794: TEST_repair_stats_ec: test 26 = 26 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:23:23.012 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:23:23.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:23:23.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:23:23.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:23:23.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:23:23.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:23:23.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:23:23.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:23:23.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:23:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:23:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:23:23.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:23:23.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:23:23.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:23:23.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:23:23.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:23:23.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:23:23.143 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:23.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:23.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:23:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:23:23.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:23:23.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:23:23.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:23:23.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:23:23.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:23:23.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:23:23.149 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:23:23.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:23:23.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:23:23.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:23:23.150 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:23:23.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:23:23.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:23:23.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:23:23.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:23:23.152 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:23.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:23.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:23:23.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:23:23.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:23:23.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:23:23.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:23:23.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:23.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:23.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:23:23.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:23:23.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T23:23:23.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:23:23.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:23:23.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_request_scrub_priority td/osd-scrub-repair 2026-03-08T23:23:23.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6197: TEST_request_scrub_priority: local dir=td/osd-scrub-repair 2026-03-08T23:23:23.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6198: TEST_request_scrub_priority: local poolname=psr_pool 2026-03-08T23:23:23.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6199: TEST_request_scrub_priority: local objname=POBJ 2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6200: TEST_request_scrub_priority: local OBJECTS=64 2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6201: TEST_request_scrub_priority: local PGS=8 2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6203: TEST_request_scrub_priority: run_mon td/osd-scrub-repair a --osd_pool_default_size=1 --mon_allow_pool_size_one=true 2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 
2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:23:23.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair --osd_pool_default_size=1 --mon_allow_pool_size_one=true 2026-03-08T23:23:23.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:23:23.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:23:23.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:23:23.193 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:23:23.193 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:23.193 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:23.193 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:23:23.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=1 --mon_allow_pool_size_one=true 2026-03-08T23:23:23.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:23:23.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:23:23.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:23:23.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:23:23.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:23:23.230 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:23:23.231 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:23:23.231 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:23:23.231 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:23:23.231 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:23:23.231 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:23.231 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:23.232 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:23:23.240 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:23:23.240 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:23:23.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:23:23.295 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:23:23.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:23:23.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:23:23.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:23:23.295 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:23:23.295 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:23:23.295 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:23:23.296 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:23:23.296 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:23.296 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:23.296 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:23:23.304 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:23:23.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:23:23.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6204: TEST_request_scrub_priority: run_mgr td/osd-scrub-repair x 2026-03-08T23:23:23.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:23:23.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:23:23.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:23:23.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:23:23.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:23:23.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:23:23.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:23:23.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:23:23.483 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:23:23.483 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:23:23.483 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:23.483 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:23.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:23:23.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:23:23.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6205: TEST_request_scrub_priority: local 'ceph_osd_args=--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 ' 2026-03-08T23:23:23.507 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6206: TEST_request_scrub_priority: ceph_osd_args+=--osd_scrub_backoff_ratio=0 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6207: TEST_request_scrub_priority: run_osd td/osd-scrub-repair 0 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 
2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:23:23.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:23:23.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:23:23.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:23:23.509 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:23:23.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:23:23.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:23:23.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:23:23.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:23:23.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:23:23.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:23:23.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:23:23.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:23:23.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:23:23.513 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:23:23.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:23:23.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:23:23.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0' 2026-03-08T23:23:23.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:23:23.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:23:23.519 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 afe1dc65-6aff-45e5-8dd2-e4bdbfb6aaa5 2026-03-08T23:23:23.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=afe1dc65-6aff-45e5-8dd2-e4bdbfb6aaa5 2026-03-08T23:23:23.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 afe1dc65-6aff-45e5-8dd2-e4bdbfb6aaa5' 2026-03-08T23:23:23.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:23:23.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: 
OSD_SECRET=AQDrBK5p7Y4kIBAA46xuFiRvr7D8Q21pl7WDEA== 2026-03-08T23:23:23.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDrBK5p7Y4kIBAA46xuFiRvr7D8Q21pl7WDEA=="}' 2026-03-08T23:23:23.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new afe1dc65-6aff-45e5-8dd2-e4bdbfb6aaa5 -i td/osd-scrub-repair/0/new.json 2026-03-08T23:23:23.642 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:23:23.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:23:23.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --mkfs --key AQDrBK5p7Y4kIBAA46xuFiRvr7D8Q21pl7WDEA== --osd-uuid afe1dc65-6aff-45e5-8dd2-e4bdbfb6aaa5 2026-03-08T23:23:23.674 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:23.680+0000 7fe9091dc8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:23:23.676 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:23.684+0000 7fe9091dc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:23:23.678 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:23.684+0000 7fe9091dc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:23:23.678 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:23.684+0000 7fe9091dc8c0 -1 bdev(0x55852060ec00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:23:23.678 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:23.684+0000 7fe9091dc8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:23:25.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:23:25.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:23:25.944 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:23:25.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:23:25.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:23:26.052 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:23:26.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:23:26.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq 
'.flags_set[]' 2026-03-08T23:23:26.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:23:26.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:23:26.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 2026-03-08T23:23:26.122 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:26.128+0000 7f1048b478c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:23:26.128 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:26.132+0000 7f1048b478c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:23:26.130 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:26.136+0000 7f1048b478c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:23:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:26.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:23:26.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:23:26.565 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:26.572+0000 7f1048b478c0 -1 Falling back to public interface 2026-03-08T23:23:27.382 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:23:27.383 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:23:27.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:27.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:23:27.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:27.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:23:27.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:23:27.782 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:27.788+0000 7f1048b478c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:23:28.546 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:23:28.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:23:28.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:28.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:23:28.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:28.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:23:28.739 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:23:28.758 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:23:28.764+0000 7f1044300640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:23:29.740 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:23:29.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:23:29.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:23:29.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:23:29.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:23:29.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:23:29.911 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3144897988,v1:127.0.0.1:6803/3144897988] [v2:127.0.0.1:6804/3144897988,v1:127.0.0.1:6805/3144897988] exists,up afe1dc65-6aff-45e5-8dd2-e4bdbfb6aaa5 2026-03-08T23:23:29.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:23:29.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:23:29.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:23:29.912 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6209: TEST_request_scrub_priority: create_pool psr_pool 8 8 2026-03-08T23:23:29.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool 8 8 2026-03-08T23:23:30.118 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool' created 2026-03-08T23:23:30.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:23:31.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6210: TEST_request_scrub_priority: wait_for_clean 2026-03-08T23:23:31.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:23:31.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:23:31.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:23:31.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:23:31.141 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:23:31.141 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:23:31.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: local trace=true 2026-03-08T23:23:31.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:23:31.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:23:31.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:23:31.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:23:31.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:23:31.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:23:31.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:23:31.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:23:31.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:23:31.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:23:31.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:23:31.377 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:23:31.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836482
2026-03-08T23:23:31.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836482
2026-03-08T23:23:31.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836482'
2026-03-08T23:23:31.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:23:31.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836482
2026-03-08T23:23:31.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:23:31.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:23:31.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836482
2026-03-08T23:23:31.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:23:31.464 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836482
2026-03-08T23:23:31.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836482
2026-03-08T23:23:31.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836482'
2026-03-08T23:23:31.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:23:31.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836481 -lt 21474836482
2026-03-08T23:23:31.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:23:32.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:23:32.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:23:32.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836482
2026-03-08T23:23:32.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:23:32.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:23:32.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:23:33.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 8 == 0
2026-03-08T23:23:33.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:23:33.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:23:33.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:23:33.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:23:33.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:23:33.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:23:33.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:23:33.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=8
2026-03-08T23:23:33.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:23:33.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:23:33.214
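The `flush_pg_stats` trace above shows the helper collecting `<osd>-<seq>` pairs from `ceph tell osd.N flush_pg_stats` and splitting each pair with `cut`. A minimal standalone sketch of just that parsing step, using the seq value seen in the log (no cluster required):

```shell
# Reconstruction of the osd/seq split at ceph-helpers.sh:2273-2275.
# Each entry in $seqs is "<osd id>-<last-stat seq>"; cut splits on "-".
seqs=' 0-21474836482'
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # field 1: the osd id
    seq=$(echo "$s" | cut -d - -f 2)   # field 2: the sequence number
    echo "waiting osd.$osd seq $seq"
done
```

The helper then polls `ceph osd last-stat-seq $osd` until it reaches `$seq`, which is exactly the `test ... -lt ...` / `sleep 1` loop visible in the records above.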
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 8 = 8
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6212: TEST_request_scrub_priority: local osd=0
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6213: TEST_request_scrub_priority: add_something td/osd-scrub-repair psr_pool POBJ noscrub
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=psr_pool
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=POBJ
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']'
2026-03-08T23:23:33.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub
2026-03-08T23:23:33.681 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set
2026-03-08T23:23:33.698 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub
2026-03-08T23:23:33.895 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set
2026-03-08T23:23:33.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF
2026-03-08T23:23:33.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF
2026-03-08T23:23:33.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool psr_pool put POBJ td/osd-scrub-repair/ORIGINAL
2026-03-08T23:23:33.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6214: TEST_request_scrub_priority: get_primary psr_pool POBJ
2026-03-08T23:23:33.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=psr_pool
2026-03-08T23:23:33.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=POBJ
2026-03-08T23:23:33.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map psr_pool POBJ
2026-03-08T23:23:33.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T23:23:34.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6214: TEST_request_scrub_priority: local primary=0
2026-03-08T23:23:34.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6215: TEST_request_scrub_priority: get_pg psr_pool POBJ
2026-03-08T23:23:34.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=psr_pool
2026-03-08T23:23:34.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=POBJ
2026-03-08T23:23:34.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map psr_pool POBJ
2026-03-08T23:23:34.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid
2026-03-08T23:23:34.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6215: TEST_request_scrub_priority: local pg=1.1
2026-03-08T23:23:34.344 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6216: TEST_request_scrub_priority: ceph osd dump
2026-03-08T23:23:34.344 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6216: TEST_request_scrub_priority: awk '{ print $2 }'
2026-03-08T23:23:34.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6216: TEST_request_scrub_priority: grep '^pool.*['\'']psr_pool['\'']'
2026-03-08T23:23:34.519
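The pipeline traced above at osd-scrub-repair.sh:6216 extracts the pool id by grepping the `ceph osd dump` output for the pool's line and taking the second field with awk. A runnable sketch with `ceph osd dump` replaced by one hypothetical sample line (the real dump has many more fields):

```shell
# Stand-in for one line of `ceph osd dump` output (hypothetical values).
osd_dump="pool 1 'psr_pool' replicated size 3 min_size 2"
# Same grep-then-awk extraction as the traced pipeline: match the line
# starting with "pool" that names 'psr_pool', then print field 2 (the id).
poolid=$(echo "$osd_dump" | grep "^pool.*'psr_pool'" | awk '{ print $2 }')
echo "$poolid"
```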
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6216: TEST_request_scrub_priority: poolid=1
2026-03-08T23:23:34.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6218: TEST_request_scrub_priority: local otherpgs
2026-03-08T23:23:34.520 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: expr 8 - 1
2026-03-08T23:23:34.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: seq 0 7
2026-03-08T23:23:34.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: for i in $(seq 0 $(expr $PGS - 1))
2026-03-08T23:23:34.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6221: TEST_request_scrub_priority: opg=1.0
2026-03-08T23:23:34.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6222: TEST_request_scrub_priority: '[' 1.0 = 1.1 ']'
2026-03-08T23:23:34.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6225: TEST_request_scrub_priority: otherpgs='1.0 '
2026-03-08T23:23:34.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: get_last_scrub_stamp 1.1
2026-03-08T23:23:34.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:34.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:34.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:34.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:23:34.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: local other_last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:34.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6228: TEST_request_scrub_priority: ceph tell 1.0 schedule-scrub 1.0
2026-03-08T23:23:34.776 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:34.776 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
2026-03-08T23:23:34.776 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T23:23:34.776 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:21:54.784399+0000"
2026-03-08T23:23:34.776 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: for i in $(seq 0 $(expr $PGS - 1))
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6221: TEST_request_scrub_priority: opg=1.1
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6222: TEST_request_scrub_priority: '[' 1.1 = 1.1 ']'
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6223: TEST_request_scrub_priority: continue
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: for i in $(seq 0 $(expr $PGS - 1))
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6221: TEST_request_scrub_priority: opg=1.2
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6222: TEST_request_scrub_priority: '[' 1.2 = 1.1 ']'
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6225: TEST_request_scrub_priority: otherpgs='1.0 1.2 '
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: get_last_scrub_stamp 1.1
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:34.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") |
.last_scrub_stamp'
2026-03-08T23:23:34.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: local other_last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:34.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6228: TEST_request_scrub_priority: ceph tell 1.2 schedule-scrub 1.2
2026-03-08T23:23:35.038 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:35.038 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
2026-03-08T23:23:35.038 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T23:23:35.038 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:21:55.046459+0000"
2026-03-08T23:23:35.038 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:35.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: for i in $(seq 0 $(expr $PGS - 1))
2026-03-08T23:23:35.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6221: TEST_request_scrub_priority: opg=1.3
2026-03-08T23:23:35.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6222: TEST_request_scrub_priority: '[' 1.3 = 1.1 ']'
2026-03-08T23:23:35.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6225: TEST_request_scrub_priority: otherpgs='1.0 1.2 1.3 '
2026-03-08T23:23:35.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: get_last_scrub_stamp 1.1
2026-03-08T23:23:35.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:35.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:35.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:35.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:23:35.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: local other_last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:35.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6228: TEST_request_scrub_priority: ceph tell 1.3 schedule-scrub 1.3
2026-03-08T23:23:35.328 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:35.328 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
2026-03-08T23:23:35.328 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T23:23:35.328 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:21:55.336693+0000"
2026-03-08T23:23:35.328 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:35.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: for i in $(seq 0 $(expr $PGS - 1))
2026-03-08T23:23:35.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6221: TEST_request_scrub_priority: opg=1.4
2026-03-08T23:23:35.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6222: TEST_request_scrub_priority: '[' 1.4 = 1.1 ']'
2026-03-08T23:23:35.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6225: TEST_request_scrub_priority: otherpgs='1.0 1.2 1.3 1.4 '
2026-03-08T23:23:35.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: get_last_scrub_stamp 1.1
2026-03-08T23:23:35.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:35.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:35.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:35.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:23:35.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: local other_last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:35.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6228: TEST_request_scrub_priority: ceph tell 1.4 schedule-scrub 1.4
2026-03-08T23:23:35.613 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:35.613 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
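The loop traced repeatedly above (osd-scrub-repair.sh:6219-6225) walks all PGs of the pool and collects every PG except the one under test into `$otherpgs`. Reconstructed from the trace as a standalone sketch, with the values the log shows ($PGS=8, pool id 1, target PG 1.1):

```shell
# Rebuild of the PG-selection loop: iterate 1.0 .. 1.7 and keep
# everything except the PG under test (1.1).
PGS=8
poolid=1
pg=1.1
otherpgs=
for i in $(seq 0 $(expr $PGS - 1)); do
    opg="${poolid}.${i}"
    if [ "$opg" = "$pg" ]; then
        continue              # skip the PG whose scrub priority is tested
    fi
    otherpgs="${otherpgs}${opg} "
done
echo "$otherpgs"
```

Each collected PG then gets a low-priority `ceph tell <pg> schedule-scrub`, which is why the trace shows the same `get_last_scrub_stamp 1.1` / `schedule-scrub` pair once per other PG.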
2026-03-08T23:23:35.613 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T23:23:35.613 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:21:55.621832+0000"
2026-03-08T23:23:35.613 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:35.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: for i in $(seq 0 $(expr $PGS - 1))
2026-03-08T23:23:35.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6221: TEST_request_scrub_priority: opg=1.5
2026-03-08T23:23:35.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6222: TEST_request_scrub_priority: '[' 1.5 = 1.1 ']'
2026-03-08T23:23:35.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6225: TEST_request_scrub_priority: otherpgs='1.0 1.2 1.3 1.4 1.5 '
2026-03-08T23:23:35.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: get_last_scrub_stamp 1.1
2026-03-08T23:23:35.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:35.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:35.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:35.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:23:35.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: local other_last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:35.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6228: TEST_request_scrub_priority: ceph tell 1.5 schedule-scrub 1.5
2026-03-08T23:23:35.866 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:35.866 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
2026-03-08T23:23:35.866 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T23:23:35.866 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:21:55.874489+0000"
2026-03-08T23:23:35.866 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:35.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: for i in $(seq 0 $(expr $PGS - 1))
2026-03-08T23:23:35.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6221: TEST_request_scrub_priority: opg=1.6
2026-03-08T23:23:35.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6222: TEST_request_scrub_priority: '[' 1.6 = 1.1 ']'
2026-03-08T23:23:35.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6225: TEST_request_scrub_priority: otherpgs='1.0 1.2 1.3 1.4 1.5 1.6 '
2026-03-08T23:23:35.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: get_last_scrub_stamp 1.1
2026-03-08T23:23:35.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:35.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:35.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:35.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:23:36.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: local other_last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:36.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6228: TEST_request_scrub_priority: ceph tell 1.6 schedule-scrub 1.6
2026-03-08T23:23:36.121 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:36.121 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
2026-03-08T23:23:36.121 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T23:23:36.121 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:21:56.130109+0000"
2026-03-08T23:23:36.121 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6219: TEST_request_scrub_priority: for i in $(seq 0 $(expr $PGS - 1))
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6221: TEST_request_scrub_priority:
opg=1.7
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6222: TEST_request_scrub_priority: '[' 1.7 = 1.1 ']'
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6225: TEST_request_scrub_priority: otherpgs='1.0 1.2 1.3 1.4 1.5 1.6 1.7 '
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: get_last_scrub_stamp 1.1
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:36.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:23:36.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6226: TEST_request_scrub_priority: local other_last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:36.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6228: TEST_request_scrub_priority: ceph tell 1.7 schedule-scrub 1.7
2026-03-08T23:23:36.388 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:36.388 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
2026-03-08T23:23:36.388 INFO:tasks.workunit.client.0.vm03.stdout: "must": false,
2026-03-08T23:23:36.388 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:21:56.396433+0000"
2026-03-08T23:23:36.388 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:36.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6231: TEST_request_scrub_priority: sleep 15
2026-03-08T23:23:51.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6232: TEST_request_scrub_priority: flush_pg_stats
2026-03-08T23:23:51.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:23:51.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:23:51.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0
2026-03-08T23:23:51.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:23:51.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:23:51.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:23:51.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836487
2026-03-08T23:23:51.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836487
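The second `flush_pg_stats` run that starts here repeats the polling loop from ceph-helpers.sh:2277-2279: compare `ceph osd last-stat-seq` against the target seq, sleep a second, and give up if the 300-iteration budget hits zero. A runnable sketch with the `ceph` call replaced by a stub counter (values taken from the log; no cluster needed):

```shell
# Sketch of the flush_pg_stats wait loop. `ceph osd last-stat-seq 0`
# is stubbed out as an advancing counter so the sketch runs standalone.
target=21474836487
cur=21474836486        # first poll in the log returned target-1
timeout=300
while test "$cur" -lt "$target"; do
    timeout=$((timeout - 1))
    if [ "$timeout" -eq 0 ]; then
        echo "timed out waiting for osd" >&2
        exit 1
    fi
    cur=$((cur + 1))   # stub for: cur=$(ceph osd last-stat-seq 0)
done
echo "osd caught up at seq $cur"
```

This matches the trace: the first comparison (`21474836486 -lt 21474836487`) forces one sleep, the second poll returns the target and the loop exits.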
2026-03-08T23:23:51.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487'
2026-03-08T23:23:51.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:23:51.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836487
2026-03-08T23:23:51.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:23:51.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:23:51.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836487
2026-03-08T23:23:51.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:23:51.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836487
2026-03-08T23:23:51.656 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836487
2026-03-08T23:23:51.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836487'
2026-03-08T23:23:51.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:23:51.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486
-lt 21474836487
2026-03-08T23:23:51.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:23:52.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:23:52.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:23:52.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836487
2026-03-08T23:23:53.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6235: TEST_request_scrub_priority: get_last_scrub_stamp 1.1
2026-03-08T23:23:53.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:53.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:53.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:53.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:23:53.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6235: TEST_request_scrub_priority: local last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:53.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6236: TEST_request_scrub_priority: ceph tell 1.1 scrub
2026-03-08T23:23:53.232 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:23:53.232 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false,
2026-03-08T23:23:53.232 INFO:tasks.workunit.client.0.vm03.stdout: "must": true,
2026-03-08T23:23:53.232 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "0.000000"
2026-03-08T23:23:53.232 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:23:53.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6238: TEST_request_scrub_priority: ceph osd unset noscrub
2026-03-08T23:23:53.453 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is unset
2026-03-08T23:23:53.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6239: TEST_request_scrub_priority: ceph osd unset nodeep-scrub
2026-03-08T23:23:53.662 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is unset
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6241: TEST_request_scrub_priority: wait_for_scrub 1.1 2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.1
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.1 last_scrub_stamp
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:23:53.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:23:53.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:23:53.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:23:53.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:30.125014+0000 '>' 2026-03-08T23:23:30.125014+0000
2026-03-08T23:23:53.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:23:54.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:23:54.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub:
(( i < 300 )) 2026-03-08T23:23:54.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.1 last_scrub_stamp 2026-03-08T23:23:54.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1 2026-03-08T23:23:54.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:54.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:54.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp' 2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:53.675566+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6243: TEST_request_scrub_priority: for opg in $otherpgs $pg 2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6245: TEST_request_scrub_priority: wait_for_scrub 1.0 2026-03-08T23:23:30.125014+0000 2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 
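Each `get_last_scrub_stamp` call above pipes `ceph --format json pg dump pgs` through `jq -r '.pg_stats | .[] | select(.pgid=="...") | .last_scrub_stamp'`. The equivalent selection over the dump's JSON, using a minimal mock payload (the real `pg_stats` entries carry many more fields), looks like:

```python
import json

# Minimal mock of `ceph --format json pg dump pgs` output; field
# values are taken from stamps seen in this log.
pg_dump = json.loads("""
{"pg_stats": [
  {"pgid": "1.0", "last_scrub_stamp": "2026-03-01T23:21:54.784399+0000"},
  {"pgid": "1.1", "last_scrub_stamp": "2026-03-08T23:23:53.675566+0000"}
]}
""")

def get_last_scrub_stamp(dump, pgid):
    # Same shape as the jq filter:
    #   .pg_stats | .[] | select(.pgid==PGID) | .last_scrub_stamp
    for st in dump["pg_stats"]:
        if st["pgid"] == pgid:
            return st["last_scrub_stamp"]
    return None

print(get_last_scrub_stamp(pg_dump, "1.1"))
```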
2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000 2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:23:55.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:55.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:55.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:55.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:55.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:55.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:55.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:21:54.784399+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:23:55.202 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:56.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:23:56.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:56.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:56.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:56.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:56.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:56.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:56.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:21:54.784399+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:23:56.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:57.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:23:57.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:57.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:57.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:57.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:57.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:57.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:57.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:21:54.784399+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:23:57.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:58.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:23:58.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:58.543 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:58.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:58.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:58.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:58.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:58.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:21:54.784399+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:23:58.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:23:59.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:23:59.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:23:59.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:23:59.719 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:23:59.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:23:59.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:23:59.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:23:59.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:21:54.784399+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:23:59.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:24:00.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:24:00.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:24:00.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:24:00.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:24:00.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:24:00.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:24:00.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:56.668036+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6243: TEST_request_scrub_priority: for opg in $otherpgs $pg 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6245: TEST_request_scrub_priority: wait_for_scrub 1.2 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.2 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: 
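The `wait_for_scrub` loop (ceph-helpers.sh:2076-2080) decides freshness with a plain string comparison, `test NEW '>' OLD`. That is valid here because these fixed-width ISO 8601 stamps sort lexicographically in chronological order, which is why the week-old `2026-03-01T...` stamp for pg 1.0 keeps the loop sleeping until a fresh `2026-03-08T23:23:56...` stamp appears. A small check of that property, using stamps from this log:

```python
# wait_for_scrub compares last_scrub_stamp values as strings; this
# works because the stamps share a fixed-width ISO 8601 layout.
old   = "2026-03-08T23:23:30.125014+0000"   # baseline stamp
fresh = "2026-03-08T23:23:56.668036+0000"   # stamp that ends the wait
stale = "2026-03-01T23:21:54.784399+0000"   # week-old stamp for pg 1.0

print(fresh > old)          # fresh scrub detected -> return 0
print(stale > old)          # stale stamp -> sleep 1 and retry
print(old > old)            # unchanged stamp also retries
```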
(( i=0 )) 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.2 last_scrub_stamp 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.2 2026-03-08T23:24:01.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:24:01.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:24:01.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.2") | .last_scrub_stamp' 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:54.641287+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6243: TEST_request_scrub_priority: for opg in $otherpgs $pg 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6245: TEST_request_scrub_priority: wait_for_scrub 1.3 2026-03-08T23:23:30.125014+0000 
2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.3 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.3 last_scrub_stamp 2026-03-08T23:24:01.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:24:01.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:24:01.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:24:01.200 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:24:01.367 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:55.691482+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6243: TEST_request_scrub_priority: for opg in $otherpgs $pg 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6245: TEST_request_scrub_priority: wait_for_scrub 1.4 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.4 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.4 last_scrub_stamp 2026-03-08T23:24:01.367 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.4 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:24:01.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.4") | .last_scrub_stamp' 2026-03-08T23:24:01.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:57.667038+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6243: TEST_request_scrub_priority: for opg in $otherpgs $pg 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6245: TEST_request_scrub_priority: wait_for_scrub 1.5 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.5 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.534 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.5 last_scrub_stamp 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.5 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:24:01.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.5") | .last_scrub_stamp' 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:58.658902+0000 '>' 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:24:01.708 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6243: TEST_request_scrub_priority: for opg in $otherpgs $pg 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6245: TEST_request_scrub_priority: wait_for_scrub 1.6 2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.6 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.6 last_scrub_stamp 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.6 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: 
get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:24:01.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.6") | .last_scrub_stamp'
2026-03-08T23:24:01.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:21:56.130109+0000 '>' 2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:01.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:24:02.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:24:02.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:24:02.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.6 last_scrub_stamp
2026-03-08T23:24:02.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.6
2026-03-08T23:24:02.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:24:02.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:24:02.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.6") | .last_scrub_stamp'
2026-03-08T23:24:03.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:21:56.130109+0000 '>' 2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:03.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:24:04.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:24:04.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:24:04.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.6 last_scrub_stamp
2026-03-08T23:24:04.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.6
2026-03-08T23:24:04.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:24:04.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:24:04.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.6") | .last_scrub_stamp'
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:59.661785+0000 '>' 2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6243: TEST_request_scrub_priority: for opg in $otherpgs $pg
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6245: TEST_request_scrub_priority: wait_for_scrub 1.7 2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.7
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.7 last_scrub_stamp
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.7
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:24:04.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.7") | .last_scrub_stamp'
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:24:00.619907+0000 '>' 2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6243: TEST_request_scrub_priority: for opg in $otherpgs $pg
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6245: TEST_request_scrub_priority: wait_for_scrub 1.1 2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.1
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:24:04.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 ))
2026-03-08T23:24:04.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.1 last_scrub_stamp
2026-03-08T23:24:04.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.1
2026-03-08T23:24:04.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:24:04.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:24:04.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.1") | .last_scrub_stamp'
2026-03-08T23:24:04.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:23:53.675566+0000 '>' 2026-03-08T23:23:30.125014+0000
2026-03-08T23:24:04.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0
2026-03-08T23:24:04.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6249: TEST_request_scrub_priority: grep 'log_channel.*scrub ok' td/osd-scrub-repair/osd.0.log
2026-03-08T23:24:04.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6249: TEST_request_scrub_priority: sed 's/.*[[]DBG[]]//'
2026-03-08T23:24:04.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6249: TEST_request_scrub_priority: head -1
2026-03-08T23:24:04.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6249: TEST_request_scrub_priority: grep -q 1.1
2026-03-08T23:24:04.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:6249: TEST_request_scrub_priority: grep -v purged_snaps
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:24:04.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:24:04.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:24:04.617 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:24:04.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:24:04.618 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:24:04.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:24:04.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:24:04.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:24:04.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:24:04.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:24:04.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:24:04.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:24:04.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:24:04.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:24:04.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:24:04.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:24:04.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:24:04.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:24:04.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:24:04.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:24:04.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:24:04.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:24:04.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:24:04.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:24:04.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:24:04.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:24:04.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:24:04.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:24:04.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:24:04.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:24:04.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:24:04.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:24:04.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:24:04.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:24:04.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:24:04.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:24:04.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:24:04.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:24:04.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:24:04.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:24:04.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair
2026-03-08T23:24:04.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:24:04.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:24:04.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:24:04.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024
2026-03-08T23:24:04.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_scrub_warning td/osd-scrub-repair
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5842: TEST_scrub_warning: local dir=td/osd-scrub-repair
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5843: TEST_scrub_warning: local poolname=psr_pool
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5844: TEST_scrub_warning: local objname=POBJ
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5845: TEST_scrub_warning: local scrubs=5
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5846: TEST_scrub_warning: local deep_scrubs=5
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5847: TEST_scrub_warning: local i1_day=86400
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5848: TEST_scrub_warning: calc 86400 '*' 7
2026-03-08T23:24:04.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1589: calc: awk 'BEGIN{print 86400 * 7}'
2026-03-08T23:24:04.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5848: TEST_scrub_warning: local i7_days=604800
2026-03-08T23:24:04.641 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5849: TEST_scrub_warning: calc 86400 '*' 14
2026-03-08T23:24:04.641 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1589: calc: awk 'BEGIN{print 86400 * 14}'
2026-03-08T23:24:04.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5849: TEST_scrub_warning: local i14_days=1209600
2026-03-08T23:24:04.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5850: TEST_scrub_warning: local overdue=0.5
2026-03-08T23:24:04.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5851: TEST_scrub_warning: calc 604800 + 86400 + '(' 604800 '*' 0.5 ')'
2026-03-08T23:24:04.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1589: calc: awk 'BEGIN{print 604800 + 86400 + ( 604800 * 0.5 )}'
2026-03-08T23:24:04.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5851: TEST_scrub_warning: local conf_overdue_seconds=993600
2026-03-08T23:24:04.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5852: TEST_scrub_warning: calc 1209600 + 86400 + '(' 1209600 '*' 0.5 ')'
2026-03-08T23:24:04.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1589: calc: awk 'BEGIN{print 1209600 + 86400 + ( 1209600 * 0.5 )}'
2026-03-08T23:24:04.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5852: TEST_scrub_warning: local pool_overdue_seconds=1900800
2026-03-08T23:24:04.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5854: TEST_scrub_warning: run_mon td/osd-scrub-repair a --osd_pool_default_size=1 --mon_allow_pool_size_one=true
2026-03-08T23:24:04.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair
2026-03-08T23:24:04.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:24:04.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:24:04.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:24:04.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a
2026-03-08T23:24:04.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair --osd_pool_default_size=1 --mon_allow_pool_size_one=true
2026-03-08T23:24:04.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:24:04.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:24:04.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:24:04.668 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:24:04.668 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:24:04.668 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:24:04.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:24:04.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=1 --mon_allow_pool_size_one=true
2026-03-08T23:24:04.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:24:04.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:24:04.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:24:04.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:24:04.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:24:04.696 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:24:04.696 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:24:04.697 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:24:04.697 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:24:04.697 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:24:04.697 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:24:04.697 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:24:04.697 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:24:04.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:24:04.698 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:24:04.765 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:24:04.766 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:24:04.766 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:24:04.766 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:24:04.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:24:04.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T23:24:04.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5855: TEST_scrub_warning: run_mgr td/osd-scrub-repair x --mon_warn_pg_not_scrubbed_ratio=0.5 --mon_warn_pg_not_deep_scrubbed_ratio=0.5
2026-03-08T23:24:04.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T23:24:04.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:24:04.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:24:04.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:24:04.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T23:24:04.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:24:04.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:24:04.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:24:04.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:24:04.929 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:24:04.929 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:24:04.929 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:24:04.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:24:04.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:24:04.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mon_warn_pg_not_scrubbed_ratio=0.5 --mon_warn_pg_not_deep_scrubbed_ratio=0.5
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5856: TEST_scrub_warning: run_osd td/osd-scrub-repair 0 --osd_scrub_backoff_ratio=0
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:24:04.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:24:04.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:24:04.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:24:04.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:24:04.950
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' 
--osd-max-object-name-len=460' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=--osd_scrub_backoff_ratio=0 2026-03-08T23:24:04.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:24:04.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:24:04.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=fc3c2f8d-c14d-4dfb-9516-c2db1acbcc48 2026-03-08T23:24:04.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 fc3c2f8d-c14d-4dfb-9516-c2db1acbcc48' 2026-03-08T23:24:04.958 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 fc3c2f8d-c14d-4dfb-9516-c2db1acbcc48 2026-03-08T23:24:04.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool 
--gen-print-key 2026-03-08T23:24:04.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAUBa5pswpOOhAAaZD2UDOenYXQK9oZaTm51w== 2026-03-08T23:24:04.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAUBa5pswpOOhAAaZD2UDOenYXQK9oZaTm51w=="}' 2026-03-08T23:24:04.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new fc3c2f8d-c14d-4dfb-9516-c2db1acbcc48 -i td/osd-scrub-repair/0/new.json 2026-03-08T23:24:05.066 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:24:05.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:24:05.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_backoff_ratio=0 --mkfs --key AQAUBa5pswpOOhAAaZD2UDOenYXQK9oZaTm51w== --osd-uuid fc3c2f8d-c14d-4dfb-9516-c2db1acbcc48 2026-03-08T23:24:05.096 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:05.100+0000 7f80ad6ec8c0 -1 WARNING: all 
dangerous and experimental features are enabled. 2026-03-08T23:24:05.098 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:05.100+0000 7f80ad6ec8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:24:05.099 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:05.100+0000 7f80ad6ec8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:24:05.099 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:05.100+0000 7f80ad6ec8c0 -1 bdev(0x555cc4e68c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:24:05.099 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:05.100+0000 7f80ad6ec8c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:24:07.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:24:07.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:24:07.355 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:24:07.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:24:07.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:24:07.486 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:24:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:24:07.486 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_backoff_ratio=0 2026-03-08T23:24:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:24:07.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:24:07.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:24:07.501 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:07.504+0000 7fc6493048c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:24:07.503 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:07.504+0000 7fc6493048c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:24:07.504 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:07.508+0000 7fc6493048c0 -1 WARNING: all dangerous and experimental features are enabled. 
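The two `ceph-osd` invocations traced above are the core of `run_osd`: the daemon is started twice with the same argument list, first with `--mkfs` to format the store, then without it to actually run. A sketch of that shape, not the real `ceph-helpers.sh` source; the fsid, data dir, and uuid are the values from this run, and `echo` stands in for executing `ceph-osd`:

```shell
# Sketch (paraphrased from the run_osd trace, not the helper's source).
dir=td/osd-scrub-repair
id=0
uuid=fc3c2f8d-c14d-4dfb-9516-c2db1acbcc48

args=(--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7
      --auth-supported=none --mon-host=127.0.0.1:7107
      --osd-data="$dir/$id" --osd-journal="$dir/$id/journal")

# Phase 1: --mkfs formats the object store and exits. The "open stat got:
# (1) Operation not permitted" / "_read_fsid unparsable uuid" lines in the
# log are probing noise from a data dir that is not yet initialized.
echo ceph-osd -i "$id" "${args[@]}" --mkfs --osd-uuid "$uuid"

# Phase 2: the same binary and arguments, minus --mkfs, daemonizes the OSD.
# The caller then registers the key with `ceph auth add osd.0 ...` and
# waits for the map to show "osd.0 up".
echo ceph-osd -i "$id" "${args[@]}"
```

Note also the single quotes around `--log-file=td/osd-scrub-repair/$name.log` and `--admin-socket=.../$cluster-$name.asok` in the trace: the literal `$name`/`$cluster` metavariables are deliberately left unexpanded by the shell so the daemon substitutes them itself.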
2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:24:07.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:24:07.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:24:08.464 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:08.468+0000 7fc6493048c0 -1 Falling back to public interface 2026-03-08T23:24:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:24:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:24:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:24:08.813 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:24:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:24:08.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:24:08.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:24:09.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:09.428+0000 7fc6493048c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:24:09.975 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:24:09.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:24:09.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:24:09.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:24:09.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:24:09.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:24:10.166 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:24:11.168 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:24:11.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:24:11.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:24:11.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:24:11.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:24:11.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:24:11.330 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3482110110,v1:127.0.0.1:6803/3482110110] [v2:127.0.0.1:6804/3482110110,v1:127.0.0.1:6805/3482110110] exists,up fc3c2f8d-c14d-4dfb-9516-c2db1acbcc48 2026-03-08T23:24:11.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:24:11.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:24:11.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:24:11.331 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: expr 5 + 5 
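The `wait_for_osd` iterations traced above reduce to a bounded poll: up to 300 one-second attempts of `ceph osd dump | grep "osd.0 up"`, succeeding on the first match. A generic sketch of that loop; `wait_for` and its calling convention are illustrative, not the helper's real signature:

```shell
# Bounded-poll sketch of the wait_for_osd loop traced above: retry a
# predicate once per second, up to a limit, returning 0 as soon as it
# holds. In the real helper the predicate is
#   ceph osd dump | grep "osd.$id up"
# and the limit is 300 iterations.
wait_for() {
    local tries=$1; shift
    local i
    for ((i = 0; i < tries; i++)); do
        "$@" && return 0
        sleep 1
    done
    return 1
}
```

With the stand-in above, the traced wait would be spelled roughly `wait_for 300 sh -c 'ceph osd dump | grep -q "osd.0 up"'`; in this run it succeeded on the fourth attempt.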
2026-03-08T23:24:11.332 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: seq 1 10 2026-03-08T23:24:11.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:11.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-1 1 1 2026-03-08T23:24:11.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-1 1 1 2026-03-08T23:24:11.519 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-1' created 2026-03-08T23:24:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:12.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:12.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:12.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:12.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:12.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:12.535 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:12.535 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:12.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:12.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:24:12.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:12.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:12.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:12.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:12.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:12.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:12.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:12.766 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:12.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:12.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:12.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:12.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836482 2026-03-08T23:24:12.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836482 2026-03-08T23:24:12.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836482' 2026-03-08T23:24:12.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:12.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836482 2026-03-08T23:24:12.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:12.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:12.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 0-21474836482 2026-03-08T23:24:12.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:12.851 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836482 2026-03-08T23:24:12.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836482 2026-03-08T23:24:12.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836482' 2026-03-08T23:24:12.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:13.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836481 -lt 21474836482 2026-03-08T23:24:13.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:14.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:14.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:14.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836482 2026-03-08T23:24:14.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:14.183 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:14.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:24:14.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:14.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:14.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:14.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:14.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:14.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:14.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:14.535 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:24:14.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:14.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:14.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:14.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:24:14.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:14.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:14.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 1 = 1 ']' 2026-03-08T23:24:14.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5864: TEST_scrub_warning: ceph osd pool set psr_pool-1 scrub_max_interval 1209600 2026-03-08T23:24:14.933 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 scrub_max_interval to 1209600 2026-03-08T23:24:14.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:14.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: 
TEST_scrub_warning: '[' 1 = 6 ']' 2026-03-08T23:24:14.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:14.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-2 1 1 2026-03-08T23:24:14.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-2 1 1 2026-03-08T23:24:15.142 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-2' created 2026-03-08T23:24:15.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:16.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:16.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:16.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:16.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:16.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:16.157 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:16.157 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:16.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:16.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:24:16.157 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:16.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:16.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:16.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:16.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:16.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:16.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:16.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:16.389 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:16.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:16.389 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:16.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836484 2026-03-08T23:24:16.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836484 2026-03-08T23:24:16.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836484' 2026-03-08T23:24:16.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:16.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836484 2026-03-08T23:24:16.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:16.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:16.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836484 2026-03-08T23:24:16.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:16.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836484 2026-03-08T23:24:16.467 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836484 2026-03-08T23:24:16.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836484' 2026-03-08T23:24:16.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:16.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836484 2026-03-08T23:24:16.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:17.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:17.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:17.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836484 2026-03-08T23:24:17.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:17.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:17.795 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:17.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 2 == 0 2026-03-08T23:24:17.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:17.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:17.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:17.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:17.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:17.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:17.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:18.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=2 2026-03-08T23:24:18.155 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:18.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:18.155 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:18.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 2 = 2 2026-03-08T23:24:18.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:18.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:18.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 2 = 1 ']' 2026-03-08T23:24:18.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:18.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 2 = 6 ']' 2026-03-08T23:24:18.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:18.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-3 1 1 2026-03-08T23:24:18.351 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-3 1 1 2026-03-08T23:24:18.564 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-3' created 2026-03-08T23:24:18.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:19.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 
2026-03-08T23:24:19.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:19.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:19.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:19.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:19.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:19.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:19.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:19.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:19.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:19.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:19.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:19.928 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T23:24:19.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T23:24:19.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:24:19.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:19.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:24:19.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:19.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:19.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:24:19.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:19.931 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:24:19.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:24:19.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:24:19.931 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:20.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T23:24:20.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:21.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:21.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:21.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836485 2026-03-08T23:24:21.441 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:21.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:21.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:21.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 3 == 0 2026-03-08T23:24:21.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:21.634 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:21.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:21.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:21.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:21.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:21.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:21.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=3 2026-03-08T23:24:21.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:21.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:21.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:21.995 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 3 = 3 2026-03-08T23:24:21.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:21.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:21.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 3 = 1 ']' 2026-03-08T23:24:21.995 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 3 = 6 ']' 2026-03-08T23:24:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-4 1 1 2026-03-08T23:24:21.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-4 1 1 2026-03-08T23:24:22.174 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-4' created 2026-03-08T23:24:22.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:23.188 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:23.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:23.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:23.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:23.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:23.188 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:23.189 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:23.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:23.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:24:23.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:23.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:23.249 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:23.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:23.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:23.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:23.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:23.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:23.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:23.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:23.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:23.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836487 2026-03-08T23:24:23.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836487 2026-03-08T23:24:23.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: 
seqs=' 0-21474836487' 2026-03-08T23:24:23.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:23.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836487 2026-03-08T23:24:23.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:23.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:23.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836487 2026-03-08T23:24:23.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:23.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836487 2026-03-08T23:24:23.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836487' 2026-03-08T23:24:23.489 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836487 2026-03-08T23:24:23.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:23.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T23:24:23.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: 
flush_pg_stats: sleep 1 2026-03-08T23:24:24.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:24.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:24.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T23:24:24.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:25.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:24:25.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:25.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836487 -lt 21474836487 2026-03-08T23:24:25.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:25.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:25.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:26.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:24:26.179 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:26.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:26.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:26.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:26.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:26.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:26.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:26.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:24:26.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:26.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:26.335 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:26.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:24:26.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:26.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:26.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 4 = 1 ']' 2026-03-08T23:24:26.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:26.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 4 = 6 ']' 2026-03-08T23:24:26.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:26.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-5 1 1 2026-03-08T23:24:26.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-5 1 1 2026-03-08T23:24:26.738 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-5' created 2026-03-08T23:24:26.753 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:24:27.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:27.821 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:27.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:27.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:27.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:27.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:27.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:28.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:28.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:28.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:28.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:28.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836489 2026-03-08T23:24:28.078 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836489 2026-03-08T23:24:28.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836489' 2026-03-08T23:24:28.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:28.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836489 2026-03-08T23:24:28.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:28.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:28.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836489 2026-03-08T23:24:28.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:28.082 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836489 2026-03-08T23:24:28.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836489 2026-03-08T23:24:28.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836489' 2026-03-08T23:24:28.082 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:24:28.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836488 -lt 21474836489 2026-03-08T23:24:28.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:29.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:29.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:29.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836488 -lt 21474836489 2026-03-08T23:24:29.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:30.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:24:30.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:30.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836489 -lt 21474836489 2026-03-08T23:24:30.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:30.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:30.579 
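The `flush_pg_stats` lines here show a poll-until-sequence pattern: `ceph tell osd.0 flush_pg_stats` returns a target sequence number, and the helper re-reads `ceph osd last-stat-seq 0` once a second until the reported value is no longer below the target (two sleeps in this run before 21474836489 landed). A sketch of that loop with the cluster call replaced by a mock counter, so the loop itself is runnable — `poll_seq` and `wait_for_seq` are illustrative names, not the real helper:

```shell
# Sketch of the flush_pg_stats wait loop; poll_seq mocks the query.
mock_last_stat_seq=0
poll_seq() {
    # real helper does: seq=$(ceph osd last-stat-seq $osd)
    mock_last_stat_seq=$((mock_last_stat_seq + 1))
}

wait_for_seq() {
    local want=$1 timeout=$2
    while true; do
        poll_seq
        test "$mock_last_stat_seq" -lt "$want" || return 0   # caught up
        timeout=$((timeout - 1))
        test "$timeout" -eq 0 && return 1                    # gave up
        # real helper sleeps 1s between polls (omitted for speed)
    done
}

wait_for_seq 3 300 && echo "caught up"
# prints: caught up
```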
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:30.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 5 == 0 2026-03-08T23:24:30.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:30.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:30.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:30.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:30.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:30.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:30.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:30.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=5 2026-03-08T23:24:30.931 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:30.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:30.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:31.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 5 = 5 2026-03-08T23:24:31.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:31.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:31.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 5 = 1 ']' 2026-03-08T23:24:31.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:31.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 5 = 6 ']' 2026-03-08T23:24:31.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:31.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-6 1 1 2026-03-08T23:24:31.119 
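The `wait_for_clean` iterations ending in `test 5 = 5` / `break` above compare the active+clean count against the total PG count each pass, walking the delay schedule between polls. A simplified, runnable sketch of that outer loop — `get_counts` is a mock standing in for the two ceph/jq queries, and the real helper has extra logic this sketch omits (for example, its handling of the delay index when the count changes between polls):

```shell
# Simplified sketch of the wait_for_clean polling loop.
iteration=0
get_counts() {                     # mock: cluster turns clean on poll 3
    iteration=$((iteration + 1))
    if test "$iteration" -ge 3; then counts="5 5"; else counts="4 5"; fi
}

sketch_wait_for_clean() {
    local delays="0.1 0.2 0.4 0.8" d clean total
    for d in $delays; do
        get_counts
        clean=${counts% *}; total=${counts#* }
        test "$clean" = "$total" && return 0   # all PGs active+clean
        # real helper: sleep "$d" before the next poll
    done
    return 1                                    # schedule exhausted
}

sketch_wait_for_clean && echo "cluster clean"
# prints: cluster clean
```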
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-6 1 1 2026-03-08T23:24:31.324 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-6' created 2026-03-08T23:24:31.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 
2026-03-08T23:24:32.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:32.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:32.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:32.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:32.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:32.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:32.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:32.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:32.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:32.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:32.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:32.628 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836491 2026-03-08T23:24:32.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836491 2026-03-08T23:24:32.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836491' 2026-03-08T23:24:32.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:32.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836491 2026-03-08T23:24:32.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:32.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:32.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836491 2026-03-08T23:24:32.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:32.631 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836491 2026-03-08T23:24:32.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836491 2026-03-08T23:24:32.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836491' 2026-03-08T23:24:32.631 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:32.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836491 2026-03-08T23:24:32.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:33.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:33.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:33.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836491 -lt 21474836491 2026-03-08T23:24:33.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:33.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:33.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:34.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 6 == 0 2026-03-08T23:24:34.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:34.145 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:34.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:34.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:34.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:34.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:34.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:34.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=6 2026-03-08T23:24:34.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:34.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:34.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:34.507 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 6 = 6 2026-03-08T23:24:34.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:34.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:34.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 6 = 1 ']' 2026-03-08T23:24:34.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:34.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 6 = 6 ']' 2026-03-08T23:24:34.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5868: TEST_scrub_warning: ceph osd pool set psr_pool-6 deep_scrub_interval 1209600 2026-03-08T23:24:34.711 INFO:tasks.workunit.client.0.vm03.stderr:set pool 6 deep_scrub_interval to 1209600 2026-03-08T23:24:34.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:34.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-7 1 1 2026-03-08T23:24:34.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-7 1 1 2026-03-08T23:24:34.920 
INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-7' created 2026-03-08T23:24:34.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:35.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:35.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:35.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:35.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:35.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:35.935 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:35.935 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:35.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:35.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:24:35.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 
2026-03-08T23:24:35.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:35.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:35.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:35.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:35.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:35.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:36.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:36.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:36.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:36.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:36.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836493 2026-03-08T23:24:36.236 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836493 2026-03-08T23:24:36.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493' 2026-03-08T23:24:36.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:36.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836493 2026-03-08T23:24:36.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:36.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:36.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836493 2026-03-08T23:24:36.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:36.239 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836493 2026-03-08T23:24:36.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836493 2026-03-08T23:24:36.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836493' 2026-03-08T23:24:36.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:24:36.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836491 -lt 21474836493 2026-03-08T23:24:36.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:37.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:37.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:37.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836493 -lt 21474836493 2026-03-08T23:24:37.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:37.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:37.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:37.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 7 == 0 2026-03-08T23:24:37.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:37.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:37.757 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:37.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:37.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:37.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:37.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:37.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=7 2026-03-08T23:24:37.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:37.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:37.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:38.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 7 = 7 2026-03-08T23:24:38.131 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:38.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:38.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 7 = 1 ']' 2026-03-08T23:24:38.131 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:38.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 7 = 6 ']' 2026-03-08T23:24:38.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:38.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-8 1 1 2026-03-08T23:24:38.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-8 1 1 2026-03-08T23:24:38.337 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-8' created 2026-03-08T23:24:38.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:39.355 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:24:39.355 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:39.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:39.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:39.423 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:39.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:39.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:39.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:39.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:39.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:39.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:39.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:39.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836494 2026-03-08T23:24:39.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836494 2026-03-08T23:24:39.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494' 2026-03-08T23:24:39.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:24:39.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836494 2026-03-08T23:24:39.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:39.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:39.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836494 2026-03-08T23:24:39.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:39.673 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836494 2026-03-08T23:24:39.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836494 2026-03-08T23:24:39.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836494' 2026-03-08T23:24:39.674 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:39.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836493 -lt 21474836494 2026-03-08T23:24:39.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:40.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: 
flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:40.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836493 -lt 21474836494 2026-03-08T23:24:41.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:42.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:24:42.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:42.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836495 -lt 21474836494 2026-03-08T23:24:42.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:42.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:42.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:42.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 8 == 0 2026-03-08T23:24:42.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:42.371 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:42.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:42.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:42.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:42.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:42.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:42.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=8 2026-03-08T23:24:42.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:42.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:42.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:42.729 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 8 = 8 2026-03-08T23:24:42.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:42.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:42.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 8 = 1 ']' 2026-03-08T23:24:42.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:42.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 8 = 6 ']' 2026-03-08T23:24:42.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:42.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-9 1 1 2026-03-08T23:24:42.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-9 1 1 2026-03-08T23:24:42.938 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-9' created 2026-03-08T23:24:42.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:43.955 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:43.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:43.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:43.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:43.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:43.955 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:43.955 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:43.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:43.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:24:43.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:44.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:44.017 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:44.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:44.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:44.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:44.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:44.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:44.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:44.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:44.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:44.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836496 2026-03-08T23:24:44.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836496 2026-03-08T23:24:44.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: 
seqs=' 0-21474836496' 2026-03-08T23:24:44.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:44.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836496 2026-03-08T23:24:44.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:44.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:44.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836496 2026-03-08T23:24:44.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:44.260 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836496 2026-03-08T23:24:44.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836496 2026-03-08T23:24:44.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836496' 2026-03-08T23:24:44.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:44.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836495 -lt 21474836496 2026-03-08T23:24:44.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: 
flush_pg_stats: sleep 1 2026-03-08T23:24:45.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:45.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:45.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836496 2026-03-08T23:24:45.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:45.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:45.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:45.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 9 == 0 2026-03-08T23:24:45.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:45.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:45.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:45.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 
2026-03-08T23:24:45.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:45.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:45.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:45.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=9 2026-03-08T23:24:45.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:45.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:45.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:46.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 9 = 9 2026-03-08T23:24:46.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:46.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:46.128 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 9 = 1 ']' 2026-03-08T23:24:46.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:46.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 9 = 6 ']' 2026-03-08T23:24:46.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5858: TEST_scrub_warning: for i in $(seq 1 $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:46.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5860: TEST_scrub_warning: create_pool psr_pool-10 1 1 2026-03-08T23:24:46.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create psr_pool-10 1 1 2026-03-08T23:24:46.330 INFO:tasks.workunit.client.0.vm03.stderr:pool 'psr_pool-10' created 2026-03-08T23:24:46.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:24:47.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5861: TEST_scrub_warning: wait_for_clean 2026-03-08T23:24:47.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:24:47.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:24:47.347 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:24:47.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:24:47.347 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:24:47.347 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:24:47.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:24:47.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:24:47.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:24:47.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:24:47.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:24:47.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:24:47.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:24:47.414 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:47.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:47.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:47.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:47.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:47.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:47.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836498 2026-03-08T23:24:47.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836498 2026-03-08T23:24:47.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836498' 2026-03-08T23:24:47.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:47.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836498 2026-03-08T23:24:47.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:47.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:47.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836498 2026-03-08T23:24:47.667 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:47.668 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836498 2026-03-08T23:24:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836498 2026-03-08T23:24:47.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836498' 2026-03-08T23:24:47.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:47.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836497 -lt 21474836498 2026-03-08T23:24:47.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:48.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:48.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:48.983 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836497 -lt 21474836498 2026-03-08T23:24:48.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:49.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:24:49.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:50.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836498 -lt 21474836498 2026-03-08T23:24:50.148 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:24:50.148 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:50.148 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:50.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 10 == 0 2026-03-08T23:24:50.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:24:50.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:24:50.337 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:24:50.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:24:50.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:24:50.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:24:50.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:24:50.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=10 2026-03-08T23:24:50.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:24:50.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:24:50.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:24:50.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 10 = 10 2026-03-08T23:24:50.692 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:24:50.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:24:50.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5862: TEST_scrub_warning: '[' 10 = 1 ']' 2026-03-08T23:24:50.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:50.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5866: TEST_scrub_warning: '[' 10 = 6 ']' 2026-03-08T23:24:50.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5873: TEST_scrub_warning: local primary=0 2026-03-08T23:24:50.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5875: TEST_scrub_warning: ceph osd set noscrub 2026-03-08T23:24:50.927 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:24:50.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5876: TEST_scrub_warning: ceph osd set nodeep-scrub 2026-03-08T23:24:51.135 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:24:51.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5877: TEST_scrub_warning: ceph config set global osd_scrub_interval_randomize_ratio 0 2026-03-08T23:24:51.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5878: TEST_scrub_warning: ceph config set global 
osd_deep_scrub_randomize_ratio 0 2026-03-08T23:24:51.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5879: TEST_scrub_warning: ceph config set global osd_scrub_max_interval 604800 2026-03-08T23:24:51.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5880: TEST_scrub_warning: ceph config set global osd_deep_scrub_interval 604800 2026-03-08T23:24:51.802 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5883: TEST_scrub_warning: seq 1 5 2026-03-08T23:24:51.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5883: TEST_scrub_warning: for i in $(seq 1 $scrubs) 2026-03-08T23:24:51.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5885: TEST_scrub_warning: '[' 1 = 1 ']' 2026-03-08T23:24:51.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5887: TEST_scrub_warning: overdue_seconds=1900800 2026-03-08T23:24:51.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: expr 1900800 + 100 2026-03-08T23:24:51.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: ceph tell 1.0 schedule-scrub 1900900 2026-03-08T23:24:51.869 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:51.869 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:24:51.869 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:51.869 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-14T23:21:31.878140+0000" 2026-03-08T23:24:51.869 
INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:51.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5883: TEST_scrub_warning: for i in $(seq 1 $scrubs) 2026-03-08T23:24:51.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5885: TEST_scrub_warning: '[' 2 = 1 ']' 2026-03-08T23:24:51.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5889: TEST_scrub_warning: overdue_seconds=993600 2026-03-08T23:24:51.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: expr 993600 + 200 2026-03-08T23:24:51.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: ceph tell 2.0 schedule-scrub 993800 2026-03-08T23:24:51.949 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:51.949 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:24:51.949 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:51.949 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-25T11:19:51.957971+0000" 2026-03-08T23:24:51.949 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:51.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5883: TEST_scrub_warning: for i in $(seq 1 $scrubs) 2026-03-08T23:24:51.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5885: TEST_scrub_warning: '[' 3 = 1 ']' 2026-03-08T23:24:51.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5889: TEST_scrub_warning: overdue_seconds=993600 2026-03-08T23:24:51.959 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: expr 993600 + 300 2026-03-08T23:24:51.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: ceph tell 3.0 schedule-scrub 993900 2026-03-08T23:24:52.029 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:52.029 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:24:52.029 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:52.029 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-25T11:18:12.037726+0000" 2026-03-08T23:24:52.029 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:52.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5883: TEST_scrub_warning: for i in $(seq 1 $scrubs) 2026-03-08T23:24:52.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5885: TEST_scrub_warning: '[' 4 = 1 ']' 2026-03-08T23:24:52.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5889: TEST_scrub_warning: overdue_seconds=993600 2026-03-08T23:24:52.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: expr 993600 + 400 2026-03-08T23:24:52.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: ceph tell 4.0 schedule-scrub 994000 2026-03-08T23:24:52.116 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:52.116 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:24:52.116 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:52.116 
INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-25T11:16:32.125058+0000" 2026-03-08T23:24:52.116 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:52.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5883: TEST_scrub_warning: for i in $(seq 1 $scrubs) 2026-03-08T23:24:52.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5885: TEST_scrub_warning: '[' 5 = 1 ']' 2026-03-08T23:24:52.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5889: TEST_scrub_warning: overdue_seconds=993600 2026-03-08T23:24:52.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: expr 993600 + 500 2026-03-08T23:24:52.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5891: TEST_scrub_warning: ceph tell 5.0 schedule-scrub 994100 2026-03-08T23:24:52.194 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:52.194 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:24:52.194 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:52.194 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-25T11:14:52.202611+0000" 2026-03-08T23:24:52.194 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:52.204 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5894: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:52.205 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5894: TEST_scrub_warning: expr 5 + 5 2026-03-08T23:24:52.206 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5894: TEST_scrub_warning: seq 6 10 2026-03-08T23:24:52.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5894: TEST_scrub_warning: for i in $(seq $(expr $scrubs + 1) $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:52.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:52.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: '[' 6 = 6 ']' 2026-03-08T23:24:52.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5898: TEST_scrub_warning: overdue_seconds=1900800 2026-03-08T23:24:52.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: expr 1900800 + 600 2026-03-08T23:24:52.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: ceph tell 6.0 schedule-deep-scrub 1901400 2026-03-08T23:24:52.273 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:52.273 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T23:24:52.273 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:52.273 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-14T23:13:12.281727+0000" 2026-03-08T23:24:52.273 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:52.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5894: TEST_scrub_warning: for i in $(seq $(expr $scrubs + 1) $(expr $scrubs + $deep_scrubs)) 
2026-03-08T23:24:52.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:52.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: '[' 7 = 6 ']' 2026-03-08T23:24:52.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5900: TEST_scrub_warning: overdue_seconds=993600 2026-03-08T23:24:52.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: expr 993600 + 700 2026-03-08T23:24:52.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: ceph tell 7.0 schedule-deep-scrub 994300 2026-03-08T23:24:52.347 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:52.347 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T23:24:52.347 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:52.347 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-25T11:11:32.356268+0000" 2026-03-08T23:24:52.347 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:52.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5894: TEST_scrub_warning: for i in $(seq $(expr $scrubs + 1) $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:52.357 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:52.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: '[' 8 = 6 ']' 2026-03-08T23:24:52.358 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5900: TEST_scrub_warning: overdue_seconds=993600 2026-03-08T23:24:52.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: expr 993600 + 800 2026-03-08T23:24:52.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: ceph tell 8.0 schedule-deep-scrub 994400 2026-03-08T23:24:52.425 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:52.425 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T23:24:52.425 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:52.425 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-25T11:09:52.434002+0000" 2026-03-08T23:24:52.425 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:52.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5894: TEST_scrub_warning: for i in $(seq $(expr $scrubs + 1) $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:52.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:52.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: '[' 9 = 6 ']' 2026-03-08T23:24:52.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5900: TEST_scrub_warning: overdue_seconds=993600 2026-03-08T23:24:52.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: expr 993600 + 900 2026-03-08T23:24:52.436 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: ceph tell 9.0 schedule-deep-scrub 994500 2026-03-08T23:24:52.502 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:52.502 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T23:24:52.502 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:24:52.502 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-25T11:08:12.511142+0000" 2026-03-08T23:24:52.502 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:52.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5894: TEST_scrub_warning: for i in $(seq $(expr $scrubs + 1) $(expr $scrubs + $deep_scrubs)) 2026-03-08T23:24:52.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: expr 5 + 1 2026-03-08T23:24:52.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5896: TEST_scrub_warning: '[' 10 = 6 ']' 2026-03-08T23:24:52.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5900: TEST_scrub_warning: overdue_seconds=993600 2026-03-08T23:24:52.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: expr 993600 + 1000 2026-03-08T23:24:52.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5902: TEST_scrub_warning: ceph tell 10.0 schedule-deep-scrub 994600 2026-03-08T23:24:52.582 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:24:52.582 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T23:24:52.582 INFO:tasks.workunit.client.0.vm03.stdout: "must": 
false, 2026-03-08T23:24:52.582 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-25T11:06:32.590825+0000" 2026-03-08T23:24:52.582 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:24:52.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5904: TEST_scrub_warning: flush_pg_stats 2026-03-08T23:24:52.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:24:52.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:24:52.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:24:52.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:24:52.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:24:52.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:24:52.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836500 2026-03-08T23:24:52.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836500 2026-03-08T23:24:52.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500' 2026-03-08T23:24:52.827 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:24:52.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836500 2026-03-08T23:24:52.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:24:52.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:24:52.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836500 2026-03-08T23:24:52.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:24:52.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836500 2026-03-08T23:24:52.830 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836500 2026-03-08T23:24:52.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836500' 2026-03-08T23:24:52.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:52.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836499 -lt 21474836500 2026-03-08T23:24:52.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:24:53.993 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:24:53.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:24:54.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836500 -lt 21474836500 2026-03-08T23:24:54.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5906: TEST_scrub_warning: ceph health 2026-03-08T23:24:54.320 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_WARN noscrub,nodeep-scrub flag(s) set; 5 pgs not deep-scrubbed in time; 10 pgs not scrubbed in time; 10 pool(s) do not have an application enabled; 10 pool(s) have no replicas configured; OSD count 1 < osd_pool_default_size 3 2026-03-08T23:24:54.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5907: TEST_scrub_warning: ceph health detail 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout:HEALTH_WARN noscrub,nodeep-scrub flag(s) set; 5 pgs not deep-scrubbed in time; 10 pgs not scrubbed in time; 10 pool(s) do not have an application enabled; 10 pool(s) have no replicas configured; OSD count 1 < osd_pool_default_size 3 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] OSDMAP_FLAGS: noscrub,nodeep-scrub flag(s) set 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] PG_NOT_DEEP_SCRUBBED: 5 pgs not deep-scrubbed in time 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 10.0 not deep-scrubbed since 2026-02-25T11:06:32.590825+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 9.0 not deep-scrubbed since 2026-02-25T11:08:12.511142+0000 
2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 8.0 not deep-scrubbed since 2026-02-25T11:09:52.434002+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 7.0 not deep-scrubbed since 2026-02-25T11:11:32.356268+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 6.0 not deep-scrubbed since 2026-02-14T23:13:12.281727+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] PG_NOT_SCRUBBED: 10 pgs not scrubbed in time 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 10.0 not scrubbed since 2026-02-25T11:06:32.590825+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 9.0 not scrubbed since 2026-02-25T11:08:12.511142+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 8.0 not scrubbed since 2026-02-25T11:09:52.434002+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 7.0 not scrubbed since 2026-02-25T11:11:32.356268+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 6.0 not scrubbed since 2026-02-14T23:13:12.281727+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 5.0 not scrubbed since 2026-02-25T11:14:52.202611+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 4.0 not scrubbed since 2026-02-25T11:16:32.125058+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 3.0 not scrubbed since 2026-02-25T11:18:12.037726+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 2.0 not scrubbed since 2026-02-25T11:19:51.957971+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pg 1.0 not scrubbed since 2026-02-14T23:21:31.878140+0000 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] POOL_APP_NOT_ENABLED: 10 pool(s) do not have an application enabled 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-1' 
2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-2' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-3' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-4' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-5' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-6' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-7' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-8' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-9' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: application not enabled on pool 'psr_pool-10' 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: use 'ceph osd pool application enable <pool-name> <app-name>', where <app-name> is 'cephfs', 'rbd', 'rgw', or freeform for custom applications. 
2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] POOL_NO_REDUNDANCY: 10 pool(s) have no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-1' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-2' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-3' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-4' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-5' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-6' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-7' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-8' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-9' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout: pool 'psr_pool-10' has no replicas configured 2026-03-08T23:24:54.509 INFO:tasks.workunit.client.0.vm03.stdout:[WRN] TOO_FEW_OSDS: OSD count 1 < osd_pool_default_size 3 2026-03-08T23:24:54.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5908: TEST_scrub_warning: ceph health 2026-03-08T23:24:54.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5908: TEST_scrub_warning: grep -q ' pgs not deep-scrubbed in time' 2026-03-08T23:24:54.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5909: TEST_scrub_warning: ceph health 2026-03-08T23:24:54.702 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5909: TEST_scrub_warning: grep -q ' pgs not scrubbed in time' 2026-03-08T23:24:54.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5914: TEST_scrub_warning: ceph health detail 2026-03-08T23:24:54.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5914: TEST_scrub_warning: wc -l 2026-03-08T23:24:54.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5914: TEST_scrub_warning: grep 'not scrubbed since' 2026-03-08T23:24:55.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5914: TEST_scrub_warning: COUNT=10 2026-03-08T23:24:55.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5916: TEST_scrub_warning: expr 5+5 2026-03-08T23:24:55.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5916: TEST_scrub_warning: (( 10 != 5 && 10 != 5+5 )) 2026-03-08T23:24:55.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5920: TEST_scrub_warning: ceph health detail 2026-03-08T23:24:55.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5920: TEST_scrub_warning: wc -l 2026-03-08T23:24:55.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5920: TEST_scrub_warning: grep 'not deep-scrubbed since' 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5920: 
TEST_scrub_warning: COUNT=5 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:5921: TEST_scrub_warning: '[' 5 '!=' 5 ']' 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:24:55.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:24:55.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:24:55.390 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:24:55.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:24:55.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:24:55.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:24:55.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:24:55.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:24:55.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:24:55.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:24:55.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:24:55.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:24:55.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:24:55.396 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:24:55.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:24:55.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:24:55.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:55.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:55.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown 
td/osd-scrub-repair 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:24:55.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:24:55.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:24:55.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:24:55.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:24:55.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: 
teardown: stat -f -c %T . 2026-03-08T23:24:55.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:24:55.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:24:55.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:24:55.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:24:55.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:24:55.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:24:55.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:24:55.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:24:55.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:24:55.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:24:55.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 
2026-03-08T23:24:55.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:55.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:55.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:24:55.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:24:55.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:24:55.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:24:55.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:24:55.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:55.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:55.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:24:55.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:24:55.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 
1024 ']' 2026-03-08T23:24:55.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:24:55.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:24:55.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_skip_non_repair_during_recovery td/osd-scrub-repair 2026-03-08T23:24:55.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:136: TEST_skip_non_repair_during_recovery: local dir=td/osd-scrub-repair 2026-03-08T23:24:55.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:137: TEST_skip_non_repair_during_recovery: local poolname=rbd 2026-03-08T23:24:55.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:139: TEST_skip_non_repair_during_recovery: run_mon td/osd-scrub-repair a --osd_pool_default_size=2 2026-03-08T23:24:55.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:24:55.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:24:55.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:24:55.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:24:55.416 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:24:55.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair --osd_pool_default_size=2 2026-03-08T23:24:55.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:24:55.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:24:55.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:24:55.438 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:24:55.438 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:55.438 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:55.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:24:55.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= 
--mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2 2026-03-08T23:24:55.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:24:55.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:24:55.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:24:55.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:24:55.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:24:55.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:24:55.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:24:55.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:24:55.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:24:55.466 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:24:55.466 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:55.466 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:55.466 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:24:55.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:24:55.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:24:55.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:24:55.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:24:55.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:24:55.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:24:55.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:24:55.531 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:24:55.531 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:24:55.531 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:24:55.531 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:24:55.531 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:55.531 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:55.531 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:24:55.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:24:55.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:24:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:140: TEST_skip_non_repair_during_recovery: run_mgr td/osd-scrub-repair x 2026-03-08T23:24:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 
2026-03-08T23:24:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:24:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:24:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:24:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:24:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:24:55.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:24:55.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:24:55.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:24:55.702 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:24:55.702 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:55.702 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:55.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo 
'/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:24:55.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:24:55.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:141: TEST_skip_non_repair_during_recovery: run_osd td/osd-scrub-repair 0 --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:24:55.720 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:24:55.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:24:55.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:24:55.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:24:55.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:24:55.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:24:55.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:24:55.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:24:55.721 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:24:55.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:24:55.722 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:24:55.722 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:24:55.722 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:24:55.724 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true'
2026-03-08T23:24:55.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:24:55.728 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:24:55.728 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 28516d00-1abe-4fc5-89e0-7ad65eae8389
2026-03-08T23:24:55.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=28516d00-1abe-4fc5-89e0-7ad65eae8389
2026-03-08T23:24:55.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 28516d00-1abe-4fc5-89e0-7ad65eae8389'
2026-03-08T23:24:55.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:24:55.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBHBa5p61iQLBAAE2dBpv0pFuAkvfk4cDhPYQ==
2026-03-08T23:24:55.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBHBa5p61iQLBAAE2dBpv0pFuAkvfk4cDhPYQ=="}'
2026-03-08T23:24:55.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 28516d00-1abe-4fc5-89e0-7ad65eae8389 -i td/osd-scrub-repair/0/new.json
2026-03-08T23:24:55.836 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:24:55.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T23:24:55.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true --mkfs --key AQBHBa5p61iQLBAAE2dBpv0pFuAkvfk4cDhPYQ== --osd-uuid 28516d00-1abe-4fc5-89e0-7ad65eae8389
2026-03-08T23:24:55.870 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:55.872+0000 7fbc659b28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:24:55.872 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:55.876+0000 7fbc659b28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:24:55.874 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:55.876+0000 7fbc659b28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:24:55.874 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:55.876+0000 7fbc659b28c0 -1 bdev(0x55b4c5caac00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T23:24:55.874 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:55.876+0000 7fbc659b28c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T23:24:58.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T23:24:58.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:24:58.129 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T23:24:58.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T23:24:58.129 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:24:58.241 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:24:58.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T23:24:58.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true
2026-03-08T23:24:58.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:24:58.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:24:58.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:24:58.257 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:58.260+0000 7fd9bfb548c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:24:58.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:58.264+0000 7fd9bfb548c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:24:58.261 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:58.264+0000 7fd9bfb548c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:24:58.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:24:58.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:24:59.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:24:59.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:24:59.573 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:24:59.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:24:59.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:24:59.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:24:59.708 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:24:59.712+0000 7fd9bfb548c0 -1 Falling back to public interface
2026-03-08T23:24:59.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:25:00.719 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:00.720+0000 7fd9bfb548c0 -1 osd.0 0 log_to_monitors true
2026-03-08T23:25:00.742 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:25:00.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:25:00.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:00.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:25:00.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:00.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:25:00.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:25:01.688 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:01.692+0000 7fd9bb30d640 -1 osd.0 0 waiting for initial osdmap
2026-03-08T23:25:01.916 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:25:01.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:25:01.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:01.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:25:01.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:01.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:25:02.076 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3167637248,v1:127.0.0.1:6803/3167637248] [v2:127.0.0.1:6804/3167637248,v1:127.0.0.1:6805/3167637248] exists,up 28516d00-1abe-4fc5-89e0-7ad65eae8389
2026-03-08T23:25:02.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:25:02.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:25:02.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:144: TEST_skip_non_repair_during_recovery: run_osd td/osd-scrub-repair 1 --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1'
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal'
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:25:02.077 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true'
2026-03-08T23:25:02.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1
2026-03-08T23:25:02.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:25:02.080 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 fb3c0076-3216-42f3-ac17-947fb141fc74
2026-03-08T23:25:02.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=fb3c0076-3216-42f3-ac17-947fb141fc74
2026-03-08T23:25:02.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 fb3c0076-3216-42f3-ac17-947fb141fc74'
2026-03-08T23:25:02.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:25:02.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBOBa5pS1rtBRAAYkOOGAIAELU9WemxvMRVrw==
2026-03-08T23:25:02.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBOBa5pS1rtBRAAYkOOGAIAELU9WemxvMRVrw=="}'
2026-03-08T23:25:02.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new fb3c0076-3216-42f3-ac17-947fb141fc74 -i td/osd-scrub-repair/1/new.json
2026-03-08T23:25:02.249 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:25:02.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json
2026-03-08T23:25:02.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true --mkfs --key AQBOBa5pS1rtBRAAYkOOGAIAELU9WemxvMRVrw== --osd-uuid fb3c0076-3216-42f3-ac17-947fb141fc74
2026-03-08T23:25:02.280 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:02.284+0000 7f91533d48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:02.282 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:02.284+0000 7f91533d48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:02.283 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:02.284+0000 7f91533d48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:02.283 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:02.288+0000 7f91533d48c0 -1 bdev(0x5602bd197c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted
2026-03-08T23:25:02.284 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:02.288+0000 7f91533d48c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid
2026-03-08T23:25:05.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring
2026-03-08T23:25:05.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:25:05.021 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T23:25:05.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T23:25:05.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:25:05.215 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T23:25:05.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T23:25:05.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_during_recovery=false --osd_repair_during_recovery=true --osd_debug_pretend_recovery_active=true
2026-03-08T23:25:05.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:25:05.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:25:05.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:25:05.236 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:05.240+0000 7fe00a0498c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:05.249 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:05.252+0000 7fe00a0498c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:05.251 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:05.252+0000 7fe00a0498c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:05.377 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:25:05.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T23:25:05.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:25:05.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:25:05.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:25:05.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:25:05.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:05.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:25:05.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:05.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:25:05.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:25:06.200 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:06.204+0000 7fe00a0498c0 -1 Falling back to public interface
2026-03-08T23:25:06.533 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:25:06.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:25:06.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:06.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:25:06.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:06.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:25:06.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:25:07.198 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:07.200+0000 7fe00a0498c0 -1 osd.1 0 log_to_monitors true
2026-03-08T23:25:07.713 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:25:07.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:25:07.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:07.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:25:07.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:07.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:25:07.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:25:08.920 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:25:08.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:25:08.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:08.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:25:08.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:08.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:25:09.083 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1033685426,v1:127.0.0.1:6811/1033685426] [v2:127.0.0.1:6812/1033685426,v1:127.0.0.1:6813/1033685426] exists,up fb3c0076-3216-42f3-ac17-947fb141fc74
2026-03-08T23:25:09.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:25:09.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:25:09.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:25:09.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:147: TEST_skip_non_repair_during_recovery: create_rbd_pool
2026-03-08T23:25:09.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:535: create_rbd_pool: ceph osd pool delete rbd rbd --yes-i-really-really-mean-it
2026-03-08T23:25:09.262 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' does not exist
2026-03-08T23:25:09.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:536: create_rbd_pool: create_pool rbd 4
2026-03-08T23:25:09.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create rbd 4
2026-03-08T23:25:09.487 INFO:tasks.workunit.client.0.vm03.stderr:pool 'rbd' created
2026-03-08T23:25:09.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:25:10.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:537: create_rbd_pool: rbd pool init rbd
2026-03-08T23:25:10.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:148: TEST_skip_non_repair_during_recovery: wait_for_clean
2026-03-08T23:25:10.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:25:10.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:25:10.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:25:10.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:25:10.774 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:25:10.774 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:25:10.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:25:10.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:25:10.775 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:25:10.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:25:10.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:25:10.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:25:10.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:25:10.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:25:10.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:25:11.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:25:11.026 INFO:tasks.workunit.client.0.vm03.stderr:1'
2026-03-08T23:25:11.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:25:11.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:25:11.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:25:11.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483
2026-03-08T23:25:11.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483
2026-03-08T23:25:11.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483'
2026-03-08T23:25:11.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:25:11.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:25:11.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672962
2026-03-08T23:25:11.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672962
2026-03-08T23:25:11.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-42949672962'
2026-03-08T23:25:11.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:25:11.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483
2026-03-08T23:25:11.192 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:25:11.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:25:11.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483
2026-03-08T23:25:11.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:25:11.195 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483
2026-03-08T23:25:11.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483
2026-03-08T23:25:11.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483'
2026-03-08T23:25:11.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:25:11.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:25:11.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:25:12.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:25:12.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:25:12.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836483 2026-03-08T23:25:12.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:25:12.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672962 2026-03-08T23:25:12.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:25:12.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:25:12.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672962 2026-03-08T23:25:12.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:25:12.523 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672962 
2026-03-08T23:25:12.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672962 2026-03-08T23:25:12.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672962' 2026-03-08T23:25:12.524 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:25:12.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672962 -lt 42949672962 2026-03-08T23:25:12.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:25:12.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:25:12.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:25:12.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 2026-03-08T23:25:12.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:25:12.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:25:12.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:25:12.885 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:25:12.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:25:12.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:25:12.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:25:13.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:25:13.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:25:13.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:25:13.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:25:13.261 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:150: TEST_skip_non_repair_during_recovery: add_something td/osd-scrub-repair rbd 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=rbd 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:25:13.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:25:13.466 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:25:13.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:25:13.674 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:25:13.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 
2026-03-08T23:25:13.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:25:13.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool rbd put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T23:25:13.711 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:151: TEST_skip_non_repair_during_recovery: get_not_primary rbd SOMETHING 2026-03-08T23:25:13.711 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=rbd 2026-03-08T23:25:13.711 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=SOMETHING 2026-03-08T23:25:13.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary rbd SOMETHING 2026-03-08T23:25:13.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=rbd 2026-03-08T23:25:13.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T23:25:13.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map rbd SOMETHING 2026-03-08T23:25:13.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:25:13.872 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T23:25:13.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map rbd SOMETHING 2026-03-08T23:25:13.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T23:25:14.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:151: TEST_skip_non_repair_during_recovery: scrub_and_not_schedule td/osd-scrub-repair rbd 0 2026-03-08T23:25:14.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:155: scrub_and_not_schedule: local dir=td/osd-scrub-repair 2026-03-08T23:25:14.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:156: scrub_and_not_schedule: local poolname=rbd 2026-03-08T23:25:14.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:157: scrub_and_not_schedule: local osd=0 2026-03-08T23:25:14.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:162: scrub_and_not_schedule: get_pg rbd SOMETHING 2026-03-08T23:25:14.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=rbd 2026-03-08T23:25:14.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:25:14.035 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map rbd SOMETHING 2026-03-08T23:25:14.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:25:14.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:162: scrub_and_not_schedule: local pg=1.3 2026-03-08T23:25:14.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:163: scrub_and_not_schedule: get_last_scrub_stamp 1.3 2026-03-08T23:25:14.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:25:14.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:25:14.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:25:14.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:25:14.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:163: scrub_and_not_schedule: local last_scrub=2026-03-08T23:25:09.493898+0000 2026-03-08T23:25:14.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:164: scrub_and_not_schedule: ceph tell 1.3 schedule-scrub 2026-03-08T23:25:14.438 INFO:tasks.workunit.client.0.vm03.stdout:{ 
2026-03-08T23:25:14.438 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:25:14.438 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:25:14.438 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:23:34.446750+0000" 2026-03-08T23:25:14.438 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:25:14.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:169: scrub_and_not_schedule: (( i=0 )) 2026-03-08T23:25:14.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:169: scrub_and_not_schedule: (( i < 3 )) 2026-03-08T23:25:14.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:170: scrub_and_not_schedule: get_last_scrub_stamp 1.3 2026-03-08T23:25:14.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:25:14.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:25:14.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:25:14.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:25:14.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:170: scrub_and_not_schedule: test 2026-03-08T23:25:09.493898+0000 '>' 2026-03-08T23:25:09.493898+0000 2026-03-08T23:25:14.618 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:173: scrub_and_not_schedule: sleep 1 2026-03-08T23:25:15.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:169: scrub_and_not_schedule: (( i++ )) 2026-03-08T23:25:15.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:169: scrub_and_not_schedule: (( i < 3 )) 2026-03-08T23:25:15.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:170: scrub_and_not_schedule: get_last_scrub_stamp 1.3 2026-03-08T23:25:15.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:25:15.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:25:15.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:25:15.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:25:15.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:170: scrub_and_not_schedule: test 2026-03-08T23:25:09.493898+0000 '>' 2026-03-08T23:25:09.493898+0000 2026-03-08T23:25:15.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:173: scrub_and_not_schedule: sleep 1 2026-03-08T23:25:16.786 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:169: scrub_and_not_schedule: (( i++ )) 2026-03-08T23:25:16.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:169: scrub_and_not_schedule: (( i < 3 )) 2026-03-08T23:25:16.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:170: scrub_and_not_schedule: get_last_scrub_stamp 1.3 2026-03-08T23:25:16.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.3 2026-03-08T23:25:16.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:25:16.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:25:16.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.3") | .last_scrub_stamp' 2026-03-08T23:25:16.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:170: scrub_and_not_schedule: test 2026-03-08T23:25:09.493898+0000 '>' 2026-03-08T23:25:09.493898+0000 2026-03-08T23:25:16.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:173: scrub_and_not_schedule: sleep 1 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:169: scrub_and_not_schedule: (( i++ )) 2026-03-08T23:25:17.957 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:169: scrub_and_not_schedule: (( i < 3 )) 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:179: scrub_and_not_schedule: objectstore_tool td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:25:17.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:25:17.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 
2026-03-08T23:25:17.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:25:17.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:25:17.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:25:17.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:25:17.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:25:17.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:25:18.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:25:18.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING list-attrs 2026-03-08T23:25:18.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:25:18.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:25:18.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:25:18.265 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:25:18.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:25:18.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING list-attrs 2026-03-08T23:25:18.604 INFO:tasks.workunit.client.0.vm03.stdout:_ 2026-03-08T23:25:18.604 INFO:tasks.workunit.client.0.vm03.stdout:snapset 2026-03-08T23:25:18.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:25:18.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:25:18.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:25:18.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:25:18.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 
--auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:25:18.893 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:25:18.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:25:18.894 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:25:18.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:25:18.895 INFO:tasks.workunit.client.0.vm03.stderr:start osd.0 2026-03-08T23:25:18.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' 
'--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:25:18.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:25:18.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:25:18.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:25:18.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:25:18.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:25:18.912 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:18.912+0000 7febaec878c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:18.912 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:18.912+0000 7febaec878c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:18.913 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:18.916+0000 7febaec878c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:25:19.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:20.113 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:20.116+0000 7febaec878c0 -1 Falling back to public interface 2026-03-08T23:25:20.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:25:20.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:20.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:25:20.247 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:25:20.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:20.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:25:20.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:21.133 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:21.136+0000 7febaec878c0 -1 osd.0 20 log_to_monitors true 2026-03-08T23:25:21.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:21.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:21.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:25:21.408 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:25:21.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:21.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:25:21.591 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:22.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:22.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:22.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:25:22.592 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:25:22.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:22.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:25:22.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:22.932 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:22.932+0000 7feba5c37640 -1 osd.0 20 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:25:23.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:23.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:23.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:25:23.771 INFO:tasks.workunit.client.0.vm03.stderr:4 2026-03-08T23:25:23.771 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:23.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:25:23.929 INFO:tasks.workunit.client.0.vm03.stderr:osd.0 up in weight 1 up_from 24 up_thru 24 down_at 21 last_clean_interval [5,20) [v2:127.0.0.1:6802/2240657508,v1:127.0.0.1:6803/2240657508] [v2:127.0.0.1:6804/2240657508,v1:127.0.0.1:6805/2240657508] exists,up 28516d00-1abe-4fc5-89e0-7ad65eae8389 2026-03-08T23:25:23.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:25:23.930 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:25:23.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:25:23.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:25:23.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:25:23.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:25:23.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:25:23.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:25:23.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:25:24.154 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:25:24.154 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:25:24.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:25:24.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:25:24.154 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:25:24.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=103079215106 2026-03-08T23:25:24.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 103079215106 2026-03-08T23:25:24.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106' 2026-03-08T23:25:24.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:25:24.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:25:24.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672966 2026-03-08T23:25:24.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672966 2026-03-08T23:25:24.312 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-103079215106 1-42949672966' 2026-03-08T23:25:24.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:25:24.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-103079215106 2026-03-08T23:25:24.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:25:24.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:25:24.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-103079215106 2026-03-08T23:25:24.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:25:24.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=103079215106 2026-03-08T23:25:24.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 103079215106' 2026-03-08T23:25:24.315 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 103079215106 2026-03-08T23:25:24.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:25:24.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 
103079215106 2026-03-08T23:25:24.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:25:25.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:25:25.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:25:25.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 103079215106 2026-03-08T23:25:25.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:25:26.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:25:26.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:25:26.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 103079215106 -lt 103079215106 2026-03-08T23:25:26.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:25:26.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672966 2026-03-08T23:25:26.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:25:26.789 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:25:26.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672966 2026-03-08T23:25:26.789 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:25:26.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672966 2026-03-08T23:25:26.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672966' 2026-03-08T23:25:26.790 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672966 2026-03-08T23:25:26.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:25:26.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672966 -lt 42949672966 2026-03-08T23:25:26.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:25:26.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:25:26.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:25:27.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 4 == 0 
2026-03-08T23:25:27.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:25:27.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:25:27.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:25:27.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:25:27.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:25:27.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:25:27.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:25:27.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=4 2026-03-08T23:25:27.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:25:27.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:25:27.300 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:25:27.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 4 = 4 2026-03-08T23:25:27.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:25:27.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:25:27.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:180: scrub_and_not_schedule: rados --pool rbd get SOMETHING td/osd-scrub-repair/COPY 2026-03-08T23:25:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:181: scrub_and_not_schedule: diff td/osd-scrub-repair/ORIGINAL td/osd-scrub-repair/COPY 2026-03-08T23:25:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:25:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:25:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:25:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:25:27.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:25:27.512 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:25:27.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:25:27.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:25:27.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:25:27.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:25:27.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:25:27.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:25:27.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:25:27.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:25:27.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:25:27.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:25:27.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:25:27.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:25:27.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:25:27.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:25:27.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:25:27.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:25:27.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:25:27.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:25:27.640 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:25:27.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:25:27.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:25:27.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:25:27.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:25:27.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:25:27.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:25:27.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:25:27.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:25:27.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:25:27.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:25:27.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:25:27.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:25:27.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:25:27.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:25:27.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:25:27.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:25:27.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair
2026-03-08T23:25:27.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:25:27.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:25:27.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:25:27.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024
2026-03-08T23:25:27.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:25:27.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:25:27.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair
2026-03-08T23:25:27.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:25:27.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:25:27.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:25:27.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024
2026-03-08T23:25:27.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:25:27.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_unfound_erasure_coded_appends td/osd-scrub-repair
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:903: TEST_unfound_erasure_coded_appends: unfound_erasure_coded td/osd-scrub-repair false
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:852: unfound_erasure_coded: local dir=td/osd-scrub-repair
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:853: unfound_erasure_coded: local allow_overwrites=false
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:854: unfound_erasure_coded: local poolname=ecpool
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:855: unfound_erasure_coded: local payload=ABCDEF
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:857: unfound_erasure_coded: run_mon td/osd-scrub-repair a
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a
2026-03-08T23:25:27.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair
2026-03-08T23:25:27.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:25:27.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:25:27.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:25:27.680 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:25:27.680 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:25:27.680 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:25:27.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:25:27.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false
2026-03-08T23:25:27.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:25:27.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:25:27.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:25:27.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:25:27.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:25:27.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:25:27.709 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:25:27.709 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:25:27.709 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:25:27.709 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:25:27.709 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:25:27.709 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:25:27.710 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:25:27.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:25:27.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:25:27.780 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:25:27.781 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok
2026-03-08T23:25:27.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:25:27.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host
2026-03-08T23:25:27.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:858: unfound_erasure_coded: run_mgr td/osd-scrub-repair x
2026-03-08T23:25:27.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair
2026-03-08T23:25:27.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:25:27.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:25:27.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:25:27.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x
2026-03-08T23:25:27.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:25:27.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:25:27.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:25:27.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:25:27.945 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:25:27.945 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:25:27.945 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:25:27.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:25:27.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:25:27.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:25:27.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: seq 0 3
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: for id in $(seq 0 3)
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:860: unfound_erasure_coded: run_osd td/osd-scrub-repair 0
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false '
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0'
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal'
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair'
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:25:27.973 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:25:27.975 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:25:27.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0
2026-03-08T23:25:27.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:25:27.978 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 b4381769-00ee-4a0f-857b-1da8db42f534
2026-03-08T23:25:27.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b4381769-00ee-4a0f-857b-1da8db42f534
2026-03-08T23:25:27.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 b4381769-00ee-4a0f-857b-1da8db42f534'
2026-03-08T23:25:27.978 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:25:27.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBnBa5pC4B7OxAATYVKFHjgCW2Whle/DWSWbA==
2026-03-08T23:25:27.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBnBa5pC4B7OxAATYVKFHjgCW2Whle/DWSWbA=="}'
2026-03-08T23:25:27.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b4381769-00ee-4a0f-857b-1da8db42f534 -i td/osd-scrub-repair/0/new.json
2026-03-08T23:25:28.089 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:25:28.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json
2026-03-08T23:25:28.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBnBa5pC4B7OxAATYVKFHjgCW2Whle/DWSWbA== --osd-uuid b4381769-00ee-4a0f-857b-1da8db42f534
2026-03-08T23:25:28.122 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:28.124+0000 7fa6d37b28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:28.124 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:28.128+0000 7fa6d37b28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:28.126 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:28.128+0000 7fa6d37b28c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:28.126 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:28.128+0000 7fa6d37b28c0 -1 bdev(0x55ecbaaf0c00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted
2026-03-08T23:25:28.126 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:28.128+0000 7fa6d37b28c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid
2026-03-08T23:25:30.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring
2026-03-08T23:25:30.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:25:30.402 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T23:25:30.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T23:25:30.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:25:30.513 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:25:30.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T23:25:30.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:25:30.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:25:30.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:25:30.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:25:30.530 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:30.532+0000 7fd6f0ab48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:30.532 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:30.536+0000 7fd6f0ab48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:30.535 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:30.536+0000 7fd6f0ab48c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:25:30.704 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:30.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:25:30.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:25:31.871 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:25:31.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:25:31.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:31.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:25:31.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:31.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:25:31.980 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:31.984+0000 7fd6f0ab48c0 -1 Falling back to public interface
2026-03-08T23:25:32.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:25:32.951 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:32.956+0000 7fd6f0ab48c0 -1 osd.0 0 log_to_monitors true
2026-03-08T23:25:33.040 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:25:33.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:25:33.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:33.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:25:33.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:33.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:25:33.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:25:34.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:25:34.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:25:34.244 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:25:34.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:25:34.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:25:34.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:25:34.405 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2888384744,v1:127.0.0.1:6803/2888384744] [v2:127.0.0.1:6804/2888384744,v1:127.0.0.1:6805/2888384744] exists,up b4381769-00ee-4a0f-857b-1da8db42f534
2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: for id in $(seq 0 3)
2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:860: unfound_erasure_coded: run_osd td/osd-scrub-repair 1 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:25:34.406 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:25:34.406 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:25:34.406 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:25:34.407 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:25:34.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:25:34.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:25:34.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=45c8c453-207b-43ba-a181-00230be36052 2026-03-08T23:25:34.409 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 45c8c453-207b-43ba-a181-00230be36052 2026-03-08T23:25:34.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 45c8c453-207b-43ba-a181-00230be36052' 2026-03-08T23:25:34.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:25:34.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBuBa5pdTaUGRAARKZ3VJDjhmYgzCukIFFwiA== 2026-03-08T23:25:34.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBuBa5pdTaUGRAARKZ3VJDjhmYgzCukIFFwiA=="}' 2026-03-08T23:25:34.421 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 45c8c453-207b-43ba-a181-00230be36052 -i td/osd-scrub-repair/1/new.json 2026-03-08T23:25:34.580 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:25:34.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:25:34.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBuBa5pdTaUGRAARKZ3VJDjhmYgzCukIFFwiA== --osd-uuid 45c8c453-207b-43ba-a181-00230be36052 2026-03-08T23:25:34.609 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:34.612+0000 7f6f4e0cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:34.611 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:34.612+0000 7f6f4e0cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:34.612 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:34.616+0000 7f6f4e0cf8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:25:34.612 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:34.616+0000 7f6f4e0cf8c0 -1 bdev(0x561a78ba3c00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:25:34.612 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:34.616+0000 7f6f4e0cf8c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:25:36.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:25:36.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:25:36.881 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:25:36.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:25:36.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:25:37.082 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:25:37.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:25:37.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:25:37.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:25:37.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:25:37.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:25:37.101 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:37.100+0000 7fa7439108c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:37.101 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:37.104+0000 7fa7439108c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:37.103 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:37.104+0000 7fa7439108c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:25:37.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:25:37.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:25:37.259 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:25:37.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:25:37.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:25:37.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:25:37.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:37.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:25:37.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:37.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:25:37.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:37.805 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:37.808+0000 7fa7439108c0 -1 Falling back to public interface 2026-03-08T23:25:38.428 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:25:38.428 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:25:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:38.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:25:38.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:39.026 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:39.028+0000 7fa7439108c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:25:39.584 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:25:39.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:39.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:39.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:25:39.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:39.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:25:39.753 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:40.754 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:25:40.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:40.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:40.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:25:40.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:40.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:25:40.920 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3506283943,v1:127.0.0.1:6811/3506283943] [v2:127.0.0.1:6812/3506283943,v1:127.0.0.1:6813/3506283943] exists,up 45c8c453-207b-43ba-a181-00230be36052 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: for id in $(seq 0 3) 
2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:860: unfound_erasure_coded: run_osd td/osd-scrub-repair 2 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:25:40.921 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:25:40.921 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:25:40.922 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:25:40.922 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:25:40.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:25:40.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:25:40.924 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 e0ecced4-c72b-4ba1-87bf-c89ca1f18dfb 2026-03-08T23:25:40.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e0ecced4-c72b-4ba1-87bf-c89ca1f18dfb 2026-03-08T23:25:40.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 e0ecced4-c72b-4ba1-87bf-c89ca1f18dfb' 2026-03-08T23:25:40.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:25:40.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB0Ba5pL/E6OBAAxk0IdlXEmMWswBAiELKz6A== 2026-03-08T23:25:40.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB0Ba5pL/E6OBAAxk0IdlXEmMWswBAiELKz6A=="}' 2026-03-08T23:25:40.935 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e0ecced4-c72b-4ba1-87bf-c89ca1f18dfb -i td/osd-scrub-repair/2/new.json 2026-03-08T23:25:41.098 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:25:41.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T23:25:41.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB0Ba5pL/E6OBAAxk0IdlXEmMWswBAiELKz6A== --osd-uuid e0ecced4-c72b-4ba1-87bf-c89ca1f18dfb 2026-03-08T23:25:41.129 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:41.132+0000 7f0fbb0968c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:41.130 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:41.132+0000 7f0fbb0968c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:41.131 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:41.132+0000 7f0fbb0968c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:25:41.132 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:41.136+0000 7f0fbb0968c0 -1 bdev(0x557e3c3cdc00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:25:41.132 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:41.136+0000 7f0fbb0968c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T23:25:43.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T23:25:43.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:25:43.386 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:25:43.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:25:43.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:25:43.578 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:25:43.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:25:43.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:25:43.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:25:43.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:25:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:25:43.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:43.596+0000 7fcecc9a58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:43.597 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:43.600+0000 7fcecc9a58c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:43.599 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:43.600+0000 7fcecc9a58c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:25:43.756 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:43.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:25:43.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:44.924 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:25:44.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:44.924 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:44.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:25:44.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:44.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:25:45.040 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:45.044+0000 7fcecc9a58c0 -1 Falling back to public interface 2026-03-08T23:25:45.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:46.094 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:25:46.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:46.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:46.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:25:46.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:46.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:25:46.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:46.512 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:46.516+0000 7fcecc9a58c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:25:47.270 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:25:47.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:47.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:47.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:25:47.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:47.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:25:47.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:48.453 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:25:48.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:48.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:48.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:25:48.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:48.453 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3684714249,v1:127.0.0.1:6819/3684714249] [v2:127.0.0.1:6820/3684714249,v1:127.0.0.1:6821/3684714249] exists,up e0ecced4-c72b-4ba1-87bf-c89ca1f18dfb 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: for id in $(seq 0 3) 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:860: unfound_erasure_coded: run_osd td/osd-scrub-repair 3 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:25:48.619 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:25:48.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:25:48.620 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:25:48.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:25:48.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:25:48.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:25:48.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:25:48.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:25:48.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:25:48.621 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:25:48.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:25:48.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:25:48.625 INFO:tasks.workunit.client.0.vm03.stdout:add osd3 eb5310b2-47af-4260-8814-35685c2f6464 2026-03-08T23:25:48.625 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=eb5310b2-47af-4260-8814-35685c2f6464 2026-03-08T23:25:48.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 eb5310b2-47af-4260-8814-35685c2f6464' 2026-03-08T23:25:48.626 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:25:48.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQB8Ba5pK+VuJhAAeNXgSLrF8TK8Vg64Z8rMow== 2026-03-08T23:25:48.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQB8Ba5pK+VuJhAAeNXgSLrF8TK8Vg64Z8rMow=="}' 2026-03-08T23:25:48.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new eb5310b2-47af-4260-8814-35685c2f6464 -i td/osd-scrub-repair/3/new.json 2026-03-08T23:25:48.805 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:25:48.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/3/new.json 2026-03-08T23:25:48.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 
--debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQB8Ba5pK+VuJhAAeNXgSLrF8TK8Vg64Z8rMow== --osd-uuid eb5310b2-47af-4260-8814-35685c2f6464 2026-03-08T23:25:48.837 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:48.840+0000 7f36af03a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:48.839 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:48.840+0000 7f36af03a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:48.840 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:48.844+0000 7f36af03a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:48.840 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:48.844+0000 7f36af03a8c0 -1 bdev(0x55a4b8db3c00 td/osd-scrub-repair/3/block) open stat got: (1) Operation not permitted 2026-03-08T23:25:48.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:48.844+0000 7f36af03a8c0 -1 bluestore(td/osd-scrub-repair/3) _read_fsid unparsable uuid 2026-03-08T23:25:51.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/3/keyring 2026-03-08T23:25:51.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:25:51.353 INFO:tasks.workunit.client.0.vm03.stdout:adding osd3 key to auth repository 2026-03-08T23:25:51.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T23:25:51.353 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:25:51.545 INFO:tasks.workunit.client.0.vm03.stdout:start osd.3 2026-03-08T23:25:51.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T23:25:51.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:25:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:25:51.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:25:51.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:25:51.560 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:51.564+0000 7f9c90a438c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:25:51.561 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:51.564+0000 7f9c90a438c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:51.563 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:51.564+0000 7f9c90a438c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:51.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:25:51.884 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:52.512 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:52.516+0000 7f9c90a438c0 -1 Falling back to public interface 2026-03-08T23:25:52.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:52.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:52.885 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:25:52.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:25:52.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:52.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:25:53.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:53.490 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:53.492+0000 7f9c90a438c0 -1 osd.3 0 log_to_monitors true 2026-03-08T23:25:54.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:54.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:54.047 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:25:54.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 2 2026-03-08T23:25:54.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:54.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:25:54.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:25:54.586 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:25:54.588+0000 7f9c8c1fc640 -1 osd.3 0 waiting for initial osdmap 2026-03-08T23:25:55.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:25:55.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:25:55.216 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:25:55.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:25:55.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:25:55.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/1507366407,v1:127.0.0.1:6827/1507366407] [v2:127.0.0.1:6828/1507366407,v1:127.0.0.1:6829/1507366407] exists,up eb5310b2-47af-4260-8814-35685c2f6464 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: 
wait_for_osd: status=0 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:863: unfound_erasure_coded: create_ec_pool ecpool false k=2 m=2 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=false 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T23:25:55.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=2 2026-03-08T23:25:55.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T23:25:55.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T23:25:55.927 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T23:25:55.940 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:25:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' false = true ']' 2026-03-08T23:25:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T23:25:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:25:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:25:56.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:25:56.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:25:56.942 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:25:56.942 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:25:56.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:25:56.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:25:56.942 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:25:57.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:25:57.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:25:57.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:25:57.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:25:57.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:25:57.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:25:57.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:25:57.175 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:25:57.175 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:25:57.175 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T23:25:57.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:25:57.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:25:57.175 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:25:57.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 2026-03-08T23:25:57.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T23:25:57.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T23:25:57.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:25:57.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:25:57.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672965 2026-03-08T23:25:57.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672965 2026-03-08T23:25:57.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965' 2026-03-08T23:25:57.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:25:57.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:25:57.413 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509443 2026-03-08T23:25:57.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509443 2026-03-08T23:25:57.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-64424509443' 2026-03-08T23:25:57.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:25:57.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:25:57.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345922 2026-03-08T23:25:57.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345922 2026-03-08T23:25:57.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-64424509443 3-85899345922' 2026-03-08T23:25:57.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:25:57.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T23:25:57.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:25:57.489 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:25:57.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T23:25:57.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:25:57.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486 2026-03-08T23:25:57.490 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836486 2026-03-08T23:25:57.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T23:25:57.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:25:57.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836486 2026-03-08T23:25:57.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:25:58.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:25:58.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:25:58.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 
21474836486 2026-03-08T23:25:58.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:25:58.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672965 2026-03-08T23:25:58.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:25:58.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:25:58.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672965 2026-03-08T23:25:58.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:25:58.819 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672965 2026-03-08T23:25:58.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672965 2026-03-08T23:25:58.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672965' 2026-03-08T23:25:58.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:25:58.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672965 2026-03-08T23:25:58.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:25:58.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509443 2026-03-08T23:25:58.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:25:58.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:25:58.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509443 2026-03-08T23:25:58.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:25:58.986 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509443 2026-03-08T23:25:58.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509443 2026-03-08T23:25:58.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509443' 2026-03-08T23:25:58.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:25:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509443 -lt 64424509443 2026-03-08T23:25:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:25:59.162 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345922 2026-03-08T23:25:59.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:25:59.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:25:59.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345922 2026-03-08T23:25:59.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:25:59.165 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345922 2026-03-08T23:25:59.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345922 2026-03-08T23:25:59.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345922' 2026-03-08T23:25:59.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:25:59.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345922 -lt 85899345922 2026-03-08T23:25:59.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:25:59.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 
2026-03-08T23:25:59.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:25:59.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:25:59.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:25:59.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:25:59.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:25:59.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:25:59.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:25:59.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:25:59.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:25:59.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:25:59.682 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:25:59.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:25:59.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:25:59.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:25:59.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:25:59.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:25:59.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:25:59.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:865: unfound_erasure_coded: add_something td/osd-scrub-repair ecpool 2026-03-08T23:25:59.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:25:59.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:25:59.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING 2026-03-08T23:25:59.869 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:25:59.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:25:59.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:26:00.033 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:26:00.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:26:00.235 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:26:00.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:26:00.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:26:00.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T23:26:00.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:867: unfound_erasure_coded: get_primary ecpool SOMETHING 2026-03-08T23:26:00.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool 2026-03-08T23:26:00.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local 
objectname=SOMETHING 2026-03-08T23:26:00.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:26:00.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:26:00.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:867: unfound_erasure_coded: local primary=1 2026-03-08T23:26:00.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:868: unfound_erasure_coded: get_osds ecpool SOMETHING 2026-03-08T23:26:00.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T23:26:00.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:868: unfound_erasure_coded: sed -e s/1// 2026-03-08T23:26:00.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T23:26:00.426 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:26:00.427 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:26:00.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1 2026-03-08T23:26:00.580 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:26:00.580 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:26:00.580 
INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T23:26:00.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2 3 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:868: unfound_erasure_coded: osds=('0' '2' '3') 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:868: unfound_erasure_coded: local -a osds 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:869: unfound_erasure_coded: local not_primary_first=0 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:870: unfound_erasure_coded: local not_primary_second=2 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:871: unfound_erasure_coded: local not_primary_third=3 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:876: unfound_erasure_coded: pids= 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:877: unfound_erasure_coded: run_in_background pids objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:26:00.581 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 405544"' 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 405544' 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:878: unfound_erasure_coded: run_in_background pids objectstore_tool td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 405545"' 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 405545' 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:879: unfound_erasure_coded: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:26:00.581 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 405547"' 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 405547' 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:880: unfound_erasure_coded: wait_background pids 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 405544 405545 405547' 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:26:00.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:26:00.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 405544 2026-03-08T23:26:00.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/405548: /' 2026-03-08T23:26:00.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:26:00.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:26:00.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/405550: /' 
2026-03-08T23:26:00.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T23:26:00.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/405552: /' 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:26:01.639 
INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING remove 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: remove 1#1:eb822e21:::SOMETHING:head# 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 
2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:01.639 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: start osd.0 2026-03-08T23:26:01.641 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false 
--osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: 
_objectstore_tool_nowait: local id=2 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 SOMETHING remove 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: remove 2#1:eb822e21:::SOMETHING:head# 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:01.879 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' 
--run-dir=td/osd-scrub-repair' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:01.882 
INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: start osd.2 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: 
activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T23:26:01.882 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:26:01.886 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/o405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.886 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:26:01.886 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:26:01.886 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:26:01.886 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:26:01.886 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.886 INFO:tasks.workunit.client.0.vm03.stderr:405550: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:26:01.886 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:26:01.887 
INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: remove 3#1:eb822e21:::SOMETHING:head# 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: 
activate_osd: local id=3 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:01.887 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 
2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 
2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: start osd.3
2026-03-08T23:26:01.921 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 2026-03-08T23:26:01.656+0000 7f474b23e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 2026-03-08T23:26:01.664+0000 7f474b23e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 2026-03-08T23:26:01.664+0000 7f474b23e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 0
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 2026-03-08T23:26:02.372+0000 7f474b23e8c0 -1 Falling back to public interface
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 1
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 2026-03-08T23:26:03.748+0000 7f474b23e8c0 -1 osd.0 28 log_to_monitors true
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 2
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 3
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helper-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']'
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: 2026-03-08T23:26:01.944+0000 7faed058d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: 2026-03-08T23:26:01.960+0000 7faed058d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: 2026-03-08T23:26:01.968+0000 7faed058d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: 0
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: 2026-03-08T23:26:02.428+0000 7faed058d8c0 -1 Falling back to public interface
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: 1
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T23:26:05.970 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: 2026-03-08T23:26:03.428+0000 7faed058d8c0 -1 osd.3 28 log_to_monitors true
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: 2
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: 3
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.971 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up'
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405550: osd.3 up in weight 1 up_from 33 up_thru 0 down_at 29 last_csd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: 2026-03-08T23:26:01.908+0000 7f0d4cced8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: 2026-03-08T23:26:01.924+0000 7f0d4cced8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: 2026-03-08T23:26:01.924+0000 7f0d4cced8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: 0
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: 2026-03-08T23:26:02.392+0000 7f0d4cced8c0 -1 Falling back to public interface
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: 1
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: 2026-03-08T23:26:03.412+0000 7f0d4cced8c0 -1 osd.2 28 log_to_monitors true
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: 2
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:26:05.980 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:05.981 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:26:05.981 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:26:05.981 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:26:05.981 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:26:05.981 INFO:tasks.workunit.client.0.vm03.stderr:405552: 3
2026-03-08T23:26:05.981 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:26:05.981 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405552: osd.2 up in weight 1 up_from 33 up_thru 0 down_at 29 last_clean_interval [20,28) [v2:127.0.0.1:6826/2709621251,v1:127.0.0.1:6827/2709621251] [v2:127.0.0.1:6828/2709621251,v1:127.0.0.1:6829/2709621251] exists,up eb5310b2-47af-4260-8814-35685c2f6464
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: 1
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: 2
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: 3'
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888066
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888066
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066'
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:26:06.384 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:26:06.385 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672968
2026-03-08T23:26:06.385 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672968
2026-03-08T23:26:06.385 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066 1-42949672968'
2026-03-08T23:26:06.385 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:26:06.401 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph lean_interval [15,28) [v2:127.0.0.1:6818/3180859205,v1:127.0.0.1:6819/3180859205] [v2:127.0.0.1:6820/3180859205,v1:127.0.0.1:6821/3180859205] exists,up e0ecced4-c72b-4ba1-87bf-c89ca1f18dfb
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: 1
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: 2
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: 3'
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888067
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888067
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888067'
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672969
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672969
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888067 1-42949672969'
2026-03-08T23:26:06.402 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920771
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920771
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888067 1-42949672969 2-141733920771'
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920771
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920771
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888067 1-42949672969 2-141733920771 3-141733920771'
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-146028888067
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-146028888067
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888067
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 146028888067'
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: waiting osd.0 seq 146028888067
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888067 -lt 146028888067
2026-03-08T23:26:07.020 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672969
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672969
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672969
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672969'
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: waiting osd.1 seq 42949672969
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672969 -lt 42949672969
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-141733920771
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:26:07.021 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-141733920771
2026-03-08T23:26:07.113 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtell osd.2 flush_pg_stats
2026-03-08T23:26:07.113 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920770
2026-03-08T23:26:07.113 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920770
2026-03-08T23:26:07.113 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066 1-42949672968 2-141733920770' 2026-03-08T23:26:07.113 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920770 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920770 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888066 1-42949672968 2-141733920770 3-141733920770' 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-146028888066 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-146028888066 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888066 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 146028888066' 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: waiting osd.0 seq 146028888066 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888067 -lt 146028888066 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672968 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672968 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672968 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672968' 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: waiting osd.1 seq 42949672968 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672969 -lt 42949672968 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-141733920770 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:26:07.114 INFO:tasks.workunit.client.0.vm03.stderr:405550: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/cs.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: 4 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: osd.0 up in weight 1 up_from 34 up_thru 0 down_at 29 last_clean_interval [5,28) [v2:127.0.0.1:6802/961353422,v1:127.0.0.1:6803/961353422] [v2:127.0.0.1:6804/961353422,v1:127.0.0.1:6805/961353422] exists,up b4381769-00ee-4a0f-857b-1da8db42f534 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 
2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' 
'0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: 1 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: 2 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: 3' 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888068 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888068 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888068' 2026-03-08T23:26:07.257 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888068 1-42949672970' 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920772 2026-03-08T23:26:08.873 
INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920772 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888068 1-42949672970 2-141733920772' 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920772 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920772 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-146028888068 1-42949672970 2-141733920772 3-141733920772' 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-146028888068 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-146028888068 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888068 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 146028888068' 2026-03-08T23:26:08.873 INFO:tasks.workunit.client.0.vm03.stderr:405548: waiting osd.0 seq 146028888068 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888067 -lt 146028888068 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888068 -lt 146028888068 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672970 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:08.874 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405548: /homelone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 141733920770' 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: waiting osd.2 seq 141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 2 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920770 -lt 141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 141733920770' 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: waiting osd.3 seq 141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:26:09.155 
INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 141733920770 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:26:09.155 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920772 -lt 141733920770 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:26:09.156 
INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:26:09.156 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:26:09.364 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:26:09.364 INFO:tasks.workunit.client.0.vm03.stderr:405550: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:26:09.364 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 
2026-03-08T23:26:09.364 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:26:09.364 INFO:tasks.workunit.client.0.vm03.stderr:405550: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:26:09.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970' 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: waiting osd.1 seq 42949672970 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672970 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-141733920772 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:09.592 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 141733920772' 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: waiting osd.2 seq 141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920772 -lt 141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 141733920772' 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: waiting osd.3 seq 141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920772 -lt 141733920772 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:26:09.593 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:26:10.019 
INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:405548: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:26:10.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 405545 2026-03-08T23:26:10.096 INFO:tasks.workunit.client.0.vm03.stderr:test/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 141733920771' 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: waiting osd.2 seq 141733920771 
2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920770 -lt 141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920770 -lt 141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920772 -lt 141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 141733920771' 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: waiting osd.3 seq 141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920772 -lt 141733920771 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:26:10.110 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:26:10.456 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq 
'.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:26:10.456 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:26:10.456 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:26:10.456 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:26:10.456 INFO:tasks.workunit.client.0.vm03.stderr:405552: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:405552: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 405547 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: 
wait_background: eval 'pids='\'''\''' 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:881: unfound_erasure_coded: return_code=0 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:882: unfound_erasure_coded: '[' 0 -ne 0 ']' 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:887: unfound_erasure_coded: get_pg ecpool SOMETHING 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:26:10.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:26:10.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:887: unfound_erasure_coded: local pg=1.0 2026-03-08T23:26:10.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:888: unfound_erasure_coded: repair 1.0 
2026-03-08T23:26:10.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.0 2026-03-08T23:26:10.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 1.0 2026-03-08T23:26:10.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:26:10.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:26:10.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:26:10.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:26:10.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:10.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.0 2026-03-08T23:26:10.933 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0s0 on osd.1 to repair 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.0 2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:26:10.945 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:26:10.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:26:11.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:25:55.932175+0000 '>' 2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:11.108 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:26:12.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:26:12.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:26:12.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:26:12.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:26:12.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:26:12.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:26:12.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:26:12.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:25:55.932175+0000 '>' 2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:12.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:26:13.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:26:13.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:26:13.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:26:13.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:26:13.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:26:13.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:26:13.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:26:13.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:25:55.932175+0000 '>' 2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:13.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:26:14.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:26:14.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:26:14.449 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:26:14.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:26:14.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:26:14.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:26:14.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:26:14.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:25:55.932175+0000 '>' 2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:14.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:26:15.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:26:15.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:26:15.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:26:15.629 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:26:15.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:26:15.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:26:15.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:26:15.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:25:55.932175+0000 '>' 2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:15.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:26:16.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:26:16.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:26:16.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:26:16.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:26:16.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:26:16.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:26:16.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:26:16.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:26:11.778286+0000 '>' 2026-03-08T23:25:55.932175+0000 2026-03-08T23:26:16.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:26:16.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:893: unfound_erasure_coded: seq 1 60 2026-03-08T23:26:16.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:893: unfound_erasure_coded: for f in `seq 1 60` 2026-03-08T23:26:16.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:894: unfound_erasure_coded: ceph -s 2026-03-08T23:26:16.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:894: unfound_erasure_coded: grep '1/1 objects unfound' 2026-03-08T23:26:17.187 INFO:tasks.workunit.client.0.vm03.stdout: 1/1 objects unfound (100.000%) 2026-03-08T23:26:17.187 INFO:tasks.workunit.client.0.vm03.stdout: 1/1 objects unfound (100.000%) 2026-03-08T23:26:17.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:894: unfound_erasure_coded: break 2026-03-08T23:26:17.187 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:897: unfound_erasure_coded: ceph -s 2026-03-08T23:26:17.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:897: unfound_erasure_coded: grep '4 up' 2026-03-08T23:26:17.408 INFO:tasks.workunit.client.0.vm03.stdout: osd: 4 osds: 4 up (since 11s), 4 in (since 28s) 2026-03-08T23:26:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:898: unfound_erasure_coded: ceph -s 2026-03-08T23:26:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:898: unfound_erasure_coded: grep '4 in' 2026-03-08T23:26:17.636 INFO:tasks.workunit.client.0.vm03.stdout: osd: 4 osds: 4 up (since 11s), 4 in (since 28s) 2026-03-08T23:26:17.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:899: unfound_erasure_coded: ceph -s 2026-03-08T23:26:17.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:899: unfound_erasure_coded: grep '1/1 objects unfound' 2026-03-08T23:26:17.857 INFO:tasks.workunit.client.0.vm03.stdout: 1/1 objects unfound (100.000%) 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stdout: 1/1 objects unfound (100.000%) 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: 
local dumplogs= 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:26:17.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:26:17.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:26:17.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:26:17.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:26:17.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:26:17.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:26:17.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:26:17.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:26:17.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:26:17.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:26:17.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:26:17.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:26:17.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:26:17.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:26:17.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:26:18.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:26:18.005 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:18.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:18.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:26:18.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:26:18.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:26:18.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:63: run: for func in $funcs 2026-03-08T23:26:18.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:64: run: setup td/osd-scrub-repair 2026-03-08T23:26:18.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-repair 2026-03-08T23:26:18.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-repair 2026-03-08T23:26:18.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:26:18.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:26:18.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: 
kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:26:18.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:26:18.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:26:18.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:26:18.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:26:18.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:26:18.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:26:18.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:26:18.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:26:18.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:26:18.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:26:18.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:26:18.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:26:18.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:26:18.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:26:18.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:26:18.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:26:18.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:26:18.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:26:18.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:26:18.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:26:18.017 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:18.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:18.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:26:18.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:26:18.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:26:18.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-repair 2026-03-08T23:26:18.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:26:18.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:18.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:18.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.43024 2026-03-08T23:26:18.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:26:18.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 
2026-03-08T23:26:18.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:26:18.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-repair 1' TERM HUP INT 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:65: run: TEST_unfound_erasure_coded_overwrites td/osd-scrub-repair 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:907: TEST_unfound_erasure_coded_overwrites: '[' true = true ']' 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:908: TEST_unfound_erasure_coded_overwrites: unfound_erasure_coded td/osd-scrub-repair true 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:852: unfound_erasure_coded: local dir=td/osd-scrub-repair 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:853: unfound_erasure_coded: local allow_overwrites=true 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:854: unfound_erasure_coded: local poolname=ecpool 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:855: unfound_erasure_coded: local payload=ABCDEF 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:857: unfound_erasure_coded: run_mon td/osd-scrub-repair a 
2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-repair 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-repair/a 2026-03-08T23:26:18.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-repair/a --run-dir=td/osd-scrub-repair 2026-03-08T23:26:18.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:26:18.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:18.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:18.048 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:18.048 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:18.048 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:18.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:18.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-repair/a '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-repair/log --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:26:18.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:26:18.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:26:18.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:26:18.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:26:18.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 
2026-03-08T23:26:18.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:26:18.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:26:18.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:26:18.086 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:26:18.094 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:26:18.094 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:18.094 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:18.095 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:26:18.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:26:18.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get fsid 2026-03-08T23:26:18.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:26:18.165 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:26:18.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:26:18.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:26:18.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:26:18.165 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:26:18.165 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:26:18.165 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:26:18.165 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:26:18.166 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:18.166 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:18.166 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.43024/ceph-mon.a.asok 2026-03-08T23:26:18.171 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:26:18.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.43024/ceph-mon.a.asok config get mon_host 2026-03-08T23:26:18.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:858: unfound_erasure_coded: run_mgr td/osd-scrub-repair x 2026-03-08T23:26:18.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-repair 2026-03-08T23:26:18.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:26:18.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:26:18.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:26:18.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-repair/x 2026-03-08T23:26:18.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:26:18.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:26:18.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:18.343 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:18.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:18.344 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:18.344 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:18.344 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:18.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:26:18.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-repair/x '--log-file=td/osd-scrub-repair/$name.log' '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --run-dir=td/osd-scrub-repair '--pid-file=td/osd-scrub-repair/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:26:18.365 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: seq 0 3 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: for id in $(seq 0 3) 
2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:860: unfound_erasure_coded: run_osd td/osd-scrub-repair 0 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:18.366 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:26:18.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:18.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:26:18.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:18.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:18.368 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:18.368 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:18.368 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:18.372 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:18.372 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:26:18.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:26:18.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:26:18.373 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 8256f948-8a50-451a-be57-db2f5c1f156a 2026-03-08T23:26:18.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=8256f948-8a50-451a-be57-db2f5c1f156a 2026-03-08T23:26:18.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 8256f948-8a50-451a-be57-db2f5c1f156a' 2026-03-08T23:26:18.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:26:18.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCaBa5pdtttFxAAACs5ICteJGbAdcgftyFhFA== 2026-03-08T23:26:18.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCaBa5pdtttFxAAACs5ICteJGbAdcgftyFhFA=="}' 2026-03-08T23:26:18.386 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 8256f948-8a50-451a-be57-db2f5c1f156a -i td/osd-scrub-repair/0/new.json 2026-03-08T23:26:18.491 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:26:18.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/0/new.json 2026-03-08T23:26:18.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCaBa5pdtttFxAAACs5ICteJGbAdcgftyFhFA== --osd-uuid 8256f948-8a50-451a-be57-db2f5c1f156a 2026-03-08T23:26:18.525 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:18.528+0000 7f1ac01e18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:18.532 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:18.536+0000 7f1ac01e18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:18.534 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:18.536+0000 7f1ac01e18c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:18.534 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:18.536+0000 7f1ac01e18c0 -1 bdev(0x55cdfab6ac00 td/osd-scrub-repair/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:26:18.534 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:18.536+0000 7f1ac01e18c0 -1 bluestore(td/osd-scrub-repair/0) _read_fsid unparsable uuid 2026-03-08T23:26:21.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/0/keyring 2026-03-08T23:26:21.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:26:21.072 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:26:21.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:26:21.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:26:21.293 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:26:21.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:26:21.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:26:21.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:26:21.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:26:21.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:26:21.312 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:21.312+0000 7fed6c36b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:21.316 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:21.320+0000 7fed6c36b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:21.318 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:21.320+0000 7fed6c36b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:21.486 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:21.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:21.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:22.517 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:22.520+0000 7fed6c36b8c0 -1 Falling back to public interface 2026-03-08T23:26:22.657 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:26:22.657 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:22.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:22.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:26:22.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:22.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:22.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:23.515 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:23.516+0000 7fed6c36b8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:26:23.827 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:26:23.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:23.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:23.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:26:23.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:23.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:24.031 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:24.709 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:24.712+0000 7fed67b24640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:26:25.033 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:26:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:26:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:25.210 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2634555092,v1:127.0.0.1:6803/2634555092] [v2:127.0.0.1:6804/2634555092,v1:127.0.0.1:6805/2634555092] exists,up 8256f948-8a50-451a-be57-db2f5c1f156a 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:26:25.211 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: for id in $(seq 0 3) 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:860: unfound_erasure_coded: run_osd td/osd-scrub-repair 1 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/1 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: 
ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/1' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/1/journal' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:25.211 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:25.212 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:25.212 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:26:25.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/1 2026-03-08T23:26:25.214 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:26:25.215 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 85dc1d0c-8201-4553-920b-f112cecb4390 2026-03-08T23:26:25.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=85dc1d0c-8201-4553-920b-f112cecb4390 2026-03-08T23:26:25.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 85dc1d0c-8201-4553-920b-f112cecb4390' 2026-03-08T23:26:25.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:26:25.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQChBa5p9pz8DRAAXES1fZYKFdCRtatvjBdtZw== 2026-03-08T23:26:25.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQChBa5p9pz8DRAAXES1fZYKFdCRtatvjBdtZw=="}' 2026-03-08T23:26:25.227 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 85dc1d0c-8201-4553-920b-f112cecb4390 -i td/osd-scrub-repair/1/new.json 2026-03-08T23:26:25.408 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:26:25.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/1/new.json 2026-03-08T23:26:25.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQChBa5p9pz8DRAAXES1fZYKFdCRtatvjBdtZw== --osd-uuid 85dc1d0c-8201-4553-920b-f112cecb4390 2026-03-08T23:26:25.441 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:25.444+0000 7fa091a168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:25.442 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:25.444+0000 7fa091a168c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:25.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:25.448+0000 7fa091a168c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:25.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:25.448+0000 7fa091a168c0 -1 bdev(0x55fda738bc00 td/osd-scrub-repair/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:26:25.444 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:25.448+0000 7fa091a168c0 -1 bluestore(td/osd-scrub-repair/1) _read_fsid unparsable uuid 2026-03-08T23:26:28.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/1/keyring 2026-03-08T23:26:28.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:26:28.202 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:26:28.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:26:28.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:26:28.415 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:26:28.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:26:28.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/1 --osd-journal=td/osd-scrub-repair/1/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:26:28.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:26:28.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:26:28.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:26:28.431 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:28.432+0000 7fdaeca598c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:28.437 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:28.440+0000 7fdaeca598c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:28.439 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:28.440+0000 7fdaeca598c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:28.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:26:28.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:29.149 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:29.152+0000 7fdaeca598c0 -1 Falling back to public interface 2026-03-08T23:26:29.780 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:26:29.780 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:29.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:29.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:26:29.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:29.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:26:29.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:30.363 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:30.364+0000 7fdaeca598c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:26:30.955 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:26:30.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:30.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:30.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:26:30.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:30.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:26:31.160 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:31.492 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:31.496+0000 7fdae8212640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T23:26:32.162 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:26:32.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:32.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:32.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:26:32.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:32.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1089448458,v1:127.0.0.1:6811/1089448458] [v2:127.0.0.1:6812/1089448458,v1:127.0.0.1:6813/1089448458] exists,up 85dc1d0c-8201-4553-920b-f112cecb4390 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:26:32.354 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: for id in $(seq 0 3) 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:860: unfound_erasure_coded: run_osd td/osd-scrub-repair 2 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: 
ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:26:32.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:32.355 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:32.355 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:26:32.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:26:32.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:26:32.357 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 c91eaba1-015c-41de-9e74-419a3fe06dc7 2026-03-08T23:26:32.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=c91eaba1-015c-41de-9e74-419a3fe06dc7 2026-03-08T23:26:32.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 c91eaba1-015c-41de-9e74-419a3fe06dc7' 2026-03-08T23:26:32.357 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:26:32.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCoBa5pAj9xFhAAANYi+771r7hAwJvX6/zzxg== 2026-03-08T23:26:32.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCoBa5pAj9xFhAAANYi+771r7hAwJvX6/zzxg=="}' 2026-03-08T23:26:32.370 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new c91eaba1-015c-41de-9e74-419a3fe06dc7 -i td/osd-scrub-repair/2/new.json 2026-03-08T23:26:32.540 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:26:32.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/2/new.json 2026-03-08T23:26:32.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCoBa5pAj9xFhAAANYi+771r7hAwJvX6/zzxg== --osd-uuid c91eaba1-015c-41de-9e74-419a3fe06dc7 2026-03-08T23:26:32.571 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:32.572+0000 7fddf95e08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:32.573 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:32.576+0000 7fddf95e08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:32.574 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:32.576+0000 7fddf95e08c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:32.574 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:32.576+0000 7fddf95e08c0 -1 bdev(0x5569e37d7c00 td/osd-scrub-repair/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:26:32.575 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:32.576+0000 7fddf95e08c0 -1 bluestore(td/osd-scrub-repair/2) _read_fsid unparsable uuid 2026-03-08T23:26:34.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/2/keyring 2026-03-08T23:26:34.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:26:34.838 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:26:34.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:26:34.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:26:35.056 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:26:35.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:26:35.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:26:35.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:26:35.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:26:35.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:26:35.072 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:35.072+0000 7f2990ba98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:35.076 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:35.080+0000 7f2990ba98c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:35.077 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:35.080+0000 7f2990ba98c0 -1 WARNING: all dangerous and experimental features are enabled. 
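The `ceph-helpers.sh:681` trace above is `run_osd`'s guard: before waiting for the OSD to report up, it pipes `ceph osd dump --format=json` through `jq '.flags_set[]'` and greps for `"noup"`, since a set `noup` flag would make the wait pointless. The following is a hypothetical standalone sketch of that decision, not the helper itself; a canned osdmap JSON stands in for the live `ceph osd dump`, and a plain `grep` on the raw JSON replaces the `jq` stage so it runs without a cluster or jq installed.

```shell
# Stand-in for: ceph osd dump --format=json (assumed/canned data, not real output)
osd_dump_json='{"epoch": 15, "flags_set": ["sortbitwise", "noup"]}'

# The traced helper uses: ... | jq '.flags_set[]' | grep -q '"noup"'
# Grepping the raw JSON for the quoted flag name is an equivalent check here.
if printf '%s' "$osd_dump_json" | grep -q '"noup"'; then
    echo "noup set: skip wait_for_osd"
else
    echo "noup clear: wait_for_osd up <id>"
fi
```

With `noup` absent from `flags_set`, the else branch runs and `run_osd` falls through to `wait_for_osd up <id>`, exactly as the trace shows next.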
2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:35.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:26:35.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:36.281 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:36.284+0000 7f2990ba98c0 -1 Falling back to public interface 2026-03-08T23:26:36.418 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:26:36.419 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:36.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:36.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:26:36.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:36.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:26:36.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:37.268 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:37.272+0000 7f2990ba98c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:26:37.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:37.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:37.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:26:37.596 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:26:37.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:37.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:26:37.781 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:38.338 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:38.340+0000 7f298c362640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T23:26:38.782 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:26:38.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:38.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:38.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:26:38.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:38.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:26:38.946 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/170050450,v1:127.0.0.1:6819/170050450] [v2:127.0.0.1:6820/170050450,v1:127.0.0.1:6821/170050450] exists,up c91eaba1-015c-41de-9e74-419a3fe06dc7 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:26:38.947 
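The `wait_for_osd` trace above (ceph-helpers.sh:978-991) is a simple bounded poll: up to 300 one-second iterations of `ceph osd dump | grep "osd.<id> up"`, breaking with status 0 on the first match. The sketch below condenses that loop; it is not the helper verbatim, and `check_osd_up` is a stand-in predicate (the real check shells out to `ceph`) so the loop can run without a cluster. Here the stand-in "comes up" on the third poll, mirroring the four dump/grep rounds seen in the trace.

```shell
tries_needed=3        # pretend the OSD first reports "up" on the 3rd poll
polls=0

check_osd_up() {      # stand-in for: ceph osd dump | grep "osd.$1 up"
    polls=$((polls + 1))
    [ "$polls" -ge "$tries_needed" ]
}

wait_for_osd() {
    # Same shape as the traced helper: bounded retries, break on success.
    local id=$1 status=1 i
    for (( i = 0; i < 300; i++ )); do
        if check_osd_up "$id"; then
            status=0
            break
        fi
        sleep 0.01    # the real helper sleeps 1s between polls
    done
    return "$status"
}

wait_for_osd 2 && echo "osd.2 up after $polls polls"
```

The 300-iteration bound means a missing OSD fails the test after roughly five minutes instead of hanging the whole teuthology job.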
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:859: unfound_erasure_coded: for id in $(seq 0 3) 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:860: unfound_erasure_coded: run_osd td/osd-scrub-repair 3 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=3 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: 
ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:38.947 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:38.948 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:38.948 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:38.948 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:26:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:26:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:38.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:38.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:38.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:38.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:38.949 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:38.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:26:38.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:26:38.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:26:38.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:26:38.950 INFO:tasks.workunit.client.0.vm03.stdout:add osd3 f024040b-6f96-4a3e-b783-b9782ad81ba9 2026-03-08T23:26:38.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=f024040b-6f96-4a3e-b783-b9782ad81ba9 2026-03-08T23:26:38.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd3 f024040b-6f96-4a3e-b783-b9782ad81ba9' 2026-03-08T23:26:38.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:26:38.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCuBa5pZz/RORAAIpjs8lCNIiQ7bWb4Ioke9Q== 2026-03-08T23:26:38.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCuBa5pZz/RORAAIpjs8lCNIiQ7bWb4Ioke9Q=="}' 2026-03-08T23:26:38.962 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new f024040b-6f96-4a3e-b783-b9782ad81ba9 -i td/osd-scrub-repair/3/new.json 2026-03-08T23:26:39.143 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:26:39.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-repair/3/new.json 2026-03-08T23:26:39.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCuBa5pZz/RORAAIpjs8lCNIiQ7bWb4Ioke9Q== --osd-uuid f024040b-6f96-4a3e-b783-b9782ad81ba9 2026-03-08T23:26:39.173 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:39.176+0000 7f3621f098c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:39.176 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:39.180+0000 7f3621f098c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:39.177 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:39.180+0000 7f3621f098c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:39.177 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:39.180+0000 7f3621f098c0 -1 bdev(0x55885914fc00 td/osd-scrub-repair/3/block) open stat got: (1) Operation not permitted 2026-03-08T23:26:39.177 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:39.180+0000 7f3621f098c0 -1 bluestore(td/osd-scrub-repair/3) _read_fsid unparsable uuid 2026-03-08T23:26:41.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-repair/3/keyring 2026-03-08T23:26:41.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:26:41.425 INFO:tasks.workunit.client.0.vm03.stdout:adding osd3 key to auth repository 2026-03-08T23:26:41.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd3 key to auth repository 2026-03-08T23:26:41.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-repair/3/keyring auth add osd.3 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:26:41.636 INFO:tasks.workunit.client.0.vm03.stdout:start osd.3 2026-03-08T23:26:41.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.3 2026-03-08T23:26:41.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair 
'--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:26:41.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:26:41.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:26:41.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:26:41.653 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:41.656+0000 7f7394a0c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:41.654 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:41.656+0000 7f7394a0c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:41.656 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:41.656+0000 7f7394a0c8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 3 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:41.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:42.857 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:42.860+0000 7f7394a0c8c0 -1 Falling back to public interface 2026-03-08T23:26:42.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:26:42.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:42.992 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:26:42.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:26:42.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:42.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:43.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:43.839 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:26:43.840+0000 7f7394a0c8c0 -1 osd.3 0 log_to_monitors true 2026-03-08T23:26:44.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:44.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:44.183 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:26:44.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:26:44.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:44.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:44.386 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:45.388 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:26:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:26:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:45.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:45.559 INFO:tasks.workunit.client.0.vm03.stdout:osd.3 up in weight 1 up_from 20 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6826/1371036095,v1:127.0.0.1:6827/1371036095] [v2:127.0.0.1:6828/1371036095,v1:127.0.0.1:6829/1371036095] exists,up f024040b-6f96-4a3e-b783-b9782ad81ba9 2026-03-08T23:26:45.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:26:45.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:26:45.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:26:45.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:863: unfound_erasure_coded: create_ec_pool ecpool 
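The xtrace above shows `ceph-helpers.sh`'s `wait_for_osd` loop polling `ceph osd dump` once per second (up to 300 tries) until `osd.3 up` appears. A minimal runnable sketch of that polling logic, with a stub `check_cmd` standing in for `ceph osd dump` so it runs without a cluster:

```shell
#!/usr/bin/env bash
# Sketch of the wait_for_osd polling loop seen in the trace: poll cluster
# state up to 300 times until the OSD reports the desired state.
# check_cmd is a stub replacing `ceph osd dump`.
cnt=$(mktemp)
echo 0 > "$cnt"

check_cmd() {
    # Stub: report "osd.3 up" on the third poll and later. A file-based
    # counter is used because the pipeline below runs in a subshell.
    local n
    n=$(($(<"$cnt") + 1))
    echo "$n" > "$cnt"
    if (( n >= 3 )); then echo "osd.3 up"; else echo "osd.3 down"; fi
}

wait_for_osd_sketch() {
    local state=$1 id=$2 status=1
    for ((i = 0; i < 300; i++)); do
        if check_cmd | grep -q "osd\.$id $state"; then
            status=0
            break
        fi
        sleep 0.01   # the real helper sleeps 1s between polls
    done
    return $status
}

result=down
wait_for_osd_sketch up 3 && result=up
echo "osd.3 state: $result after $(<"$cnt") polls"
```

The real helper returns non-zero if the state never appears within the 300 iterations, which is what callers like `run_osd` propagate as a test failure.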
true k=2 m=2 2026-03-08T23:26:45.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2498: create_ec_pool: local pool_name=ecpool 2026-03-08T23:26:45.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2499: create_ec_pool: shift 2026-03-08T23:26:45.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2500: create_ec_pool: local allow_overwrites=true 2026-03-08T23:26:45.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2501: create_ec_pool: shift 2026-03-08T23:26:45.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2503: create_ec_pool: ceph osd erasure-code-profile set myprofile crush-failure-domain=osd k=2 m=2 2026-03-08T23:26:45.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2505: create_ec_pool: create_pool ecpool 1 1 erasure myprofile 2026-03-08T23:26:45.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create ecpool 1 1 erasure myprofile 2026-03-08T23:26:46.098 INFO:tasks.workunit.client.0.vm03.stderr:pool 'ecpool' created 2026-03-08T23:26:46.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:26:47.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2507: create_ec_pool: '[' true = true ']' 2026-03-08T23:26:47.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2508: create_ec_pool: ceph osd pool set ecpool allow_ec_overwrites true 2026-03-08T23:26:47.332 
INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 allow_ec_overwrites to true 2026-03-08T23:26:47.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2511: create_ec_pool: wait_for_clean 2026-03-08T23:26:47.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:26:47.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:26:47.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:26:47.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:26:47.351 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:26:47.351 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:26:47.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:26:47.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:26:47.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:26:47.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' 
'0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:26:47.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:26:47.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:26:47.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:26:47.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:26:47.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:26:47.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:26:47.592 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:26:47.592 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:26:47.592 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T23:26:47.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:26:47.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:47.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:26:47.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836486 2026-03-08T23:26:47.688 
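The `delays` array printed above is `get_timeout_delays 90 .1` producing an exponential-backoff schedule: the step doubles from 0.1s, is capped, and the final entry tops the sum up to exactly the 90s timeout. A sketch that reproduces the same schedule (the 15s cap is inferred from the trace output, not read from `ceph-helpers.sh`):

```shell
#!/usr/bin/env bash
# Sketch of get_timeout_delays: emit doubling delays, capped, whose sum
# equals the requested timeout. awk is used for the float arithmetic.
get_timeout_delays_sketch() {
    local timeout=$1 step=$2 cap=${3:-15}   # cap=15 is an assumption
    awk -v total="$timeout" -v step="$step" -v cap="$cap" 'BEGIN {
        sum = 0
        while (sum + step < total) {
            printf "%g ", step
            sum += step
            step *= 2
            if (step > cap) step = cap
        }
        # final delay tops the sum up to exactly the timeout
        printf "%g\n", total - sum
    }'
}

delays=$(get_timeout_delays_sketch 90 .1)
echo "$delays"
```

For `90 .1` this yields `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`, matching the array in the trace; callers like `wait_for_clean` then sleep through the list instead of polling at a fixed interval.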
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836486 2026-03-08T23:26:47.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486' 2026-03-08T23:26:47.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:47.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:26:47.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672965 2026-03-08T23:26:47.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672965 2026-03-08T23:26:47.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965' 2026-03-08T23:26:47.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:47.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:26:47.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509443 2026-03-08T23:26:47.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509443 2026-03-08T23:26:47.850 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-64424509443' 2026-03-08T23:26:47.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:47.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:26:47.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=85899345922 2026-03-08T23:26:47.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 85899345922 2026-03-08T23:26:47.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836486 1-42949672965 2-64424509443 3-85899345922' 2026-03-08T23:26:47.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:47.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836486 2026-03-08T23:26:47.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:47.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:26:47.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836486 2026-03-08T23:26:47.942 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:47.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836486 2026-03-08T23:26:47.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836486' 2026-03-08T23:26:47.943 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836486 2026-03-08T23:26:47.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:26:48.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836486 2026-03-08T23:26:48.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:26:49.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:26:49.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:26:49.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836486 2026-03-08T23:26:49.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:49.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949672965 2026-03-08T23:26:49.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:49.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:26:49.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672965 2026-03-08T23:26:49.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:49.297 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672965 2026-03-08T23:26:49.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672965 2026-03-08T23:26:49.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672965' 2026-03-08T23:26:49.298 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:26:49.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672965 2026-03-08T23:26:49.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:49.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509443 2026-03-08T23:26:49.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:49.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:26:49.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509443 2026-03-08T23:26:49.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:49.481 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509443 2026-03-08T23:26:49.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509443 2026-03-08T23:26:49.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509443' 2026-03-08T23:26:49.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:26:49.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509444 -lt 64424509443 2026-03-08T23:26:49.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:49.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-85899345922 2026-03-08T23:26:49.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:49.660 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:26:49.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-85899345922 2026-03-08T23:26:49.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:49.662 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.3 seq 85899345922 2026-03-08T23:26:49.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=85899345922 2026-03-08T23:26:49.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 85899345922' 2026-03-08T23:26:49.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:26:49.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 85899345922 -lt 85899345922 2026-03-08T23:26:49.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:26:49.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:26:49.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:26:50.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 
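The `flush_pg_stats` section above follows a two-phase pattern: tell every OSD to flush its PG stats (each tell returns a sequence number), then poll `ceph osd last-stat-seq` per OSD until the monitor has caught up to that sequence. A sketch of the pattern, with stubs `list_osds`, `tell_flush`, and `last_stat_seq` replacing the real `ceph osd ls`, `ceph tell osd.N flush_pg_stats`, and `ceph osd last-stat-seq N` commands:

```shell
#!/usr/bin/env bash
# Sketch of the flush_pg_stats pattern: collect "<osd>-<seq>" pairs, then
# wait until each OSD's last-stat-seq reaches its flush sequence.
list_osds()     { echo "0 1"; }
tell_flush()    { echo $(( $1 + 100 )); }
last_stat_seq() { echo $(( $1 + 100 )); }   # stub: already caught up

flush_pg_stats_sketch() {
    local ids seqs osd seq s
    ids=$(list_osds)
    seqs=
    for osd in $ids; do
        seq=$(tell_flush "$osd")
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=${s%-*}
        seq=${s#*-}
        echo "waiting osd.$osd seq $seq"
        while [ "$(last_stat_seq "$osd")" -lt "$seq" ]; do
            sleep 0.01   # the real helper sleeps 1s, under a 300s timeout
        done
    done
}

flush_out=$(flush_pg_stats_sketch)
echo "$flush_out"
```

This is why the trace briefly sleeps at `ceph-helpers.sh:2278`: osd.0's reported `last-stat-seq` (21474836485) was one behind the flush sequence (21474836486) on the first poll.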
2026-03-08T23:26:50.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:26:50.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:26:50.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:26:50.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:26:50.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:26:50.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:26:50.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:26:50.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:26:50.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:26:50.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:26:50.260 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2512: create_ec_pool: return 0 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:865: unfound_erasure_coded: add_something td/osd-scrub-repair ecpool 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:71: add_something: local dir=td/osd-scrub-repair 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:72: add_something: local poolname=ecpool 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:73: add_something: local obj=SOMETHING 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:74: add_something: local scrub=noscrub 2026-03-08T23:26:50.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:76: add_something: '[' noscrub = noscrub ']' 2026-03-08T23:26:50.480 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:78: add_something: ceph osd set noscrub 2026-03-08T23:26:50.701 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:26:50.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:79: add_something: ceph osd set nodeep-scrub 2026-03-08T23:26:50.910 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:26:50.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:85: add_something: local payload=ABCDEF 2026-03-08T23:26:50.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:86: add_something: echo ABCDEF 2026-03-08T23:26:50.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:87: add_something: rados --pool ecpool put SOMETHING td/osd-scrub-repair/ORIGINAL 2026-03-08T23:26:50.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:867: unfound_erasure_coded: get_primary ecpool SOMETHING 2026-03-08T23:26:50.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=ecpool 2026-03-08T23:26:50.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T23:26:50.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:26:50.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq 
.acting_primary 2026-03-08T23:26:51.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:867: unfound_erasure_coded: local primary=1 2026-03-08T23:26:51.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:868: unfound_erasure_coded: get_osds ecpool SOMETHING 2026-03-08T23:26:51.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:868: unfound_erasure_coded: sed -e s/1// 2026-03-08T23:26:51.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1021: get_osds: local poolname=ecpool 2026-03-08T23:26:51.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1022: get_osds: local objectname=SOMETHING 2026-03-08T23:26:51.146 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:26:51.146 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: jq '.acting | .[]' 2026-03-08T23:26:51.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1025: get_osds: local 'osds=1 2026-03-08T23:26:51.346 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:26:51.346 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:26:51.346 INFO:tasks.workunit.client.0.vm03.stderr:3' 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1027: get_osds: echo 1 0 2 3 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:868: unfound_erasure_coded: osds=('0' 
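`get_primary` and `get_osds` above both parse `ceph --format json osd map <pool> <obj>` with `jq`, reading `.acting_primary` and `.acting` respectively. A sketch of that parsing against a hand-written stand-in JSON shaped like the fields the trace reads (it is not real cluster output):

```shell
#!/usr/bin/env bash
# Sketch of the jq extraction used by get_primary/get_osds.
# osdmap_json mimics the acting-set fields of `ceph osd map` output.
osdmap_json='{"acting_primary": 1, "acting": [1, 0, 2, 3]}'

primary=$(echo "$osdmap_json" | jq .acting_primary)
acting=$(echo "$osdmap_json" | jq '.acting | .[]')   # one OSD id per line

echo "primary=$primary"
echo "acting:" $acting
```

The test then strips the primary from the acting list (the `sed -e s/1//` in the trace) to get the non-primary shards it will damage.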
'2' '3') 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:868: unfound_erasure_coded: local -a osds 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:869: unfound_erasure_coded: local not_primary_first=0 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:870: unfound_erasure_coded: local not_primary_second=2 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:871: unfound_erasure_coded: local not_primary_third=3 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:876: unfound_erasure_coded: pids= 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:877: unfound_erasure_coded: run_in_background pids objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 416257"' 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 416257' 2026-03-08T23:26:51.347 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:878: unfound_erasure_coded: run_in_background pids objectstore_tool td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:26:51.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 416258"' 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 416258' 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:879: unfound_erasure_coded: run_in_background pids objectstore_tool td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2190: run_in_background: local pid_variable=pids 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2191: run_in_background: shift 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/416260: /' 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: eval 'pids+=" 416261"' 2026-03-08T23:26:51.348 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2195: run_in_background: pids+=' 416261' 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:880: unfound_erasure_coded: wait_background pids 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2226: wait_background: pids=' 416257 416258 416261' 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2228: wait_background: return_code=0 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 416257 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/416263: /' 2026-03-08T23:26:51.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:26:51.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: sed 's/^/416265: /' 2026-03-08T23:26:51.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T23:26:51.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: objectstore_tool 
td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:26:52.668 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.668 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:26:52.668 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=3 2026-03-08T23:26:52.668 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:26:52.668 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:26:52.668 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=3 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.3 2026-03-08T23:26:52.669 
INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 3 SOMETHING remove 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=3 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 
2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/3 SOMETHING remove 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: remove 3#1:eb822e21:::SOMETHING:head# 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 3 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=3 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/3 2026-03-08T23:26:52.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:52.669 
INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/3' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/3/journal' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:52.671 INFO:tasks.workunit.client.0.vm03.stderr:416263: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:52.672 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:52.672 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:52.672 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:26:52.672 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:26:52.672 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/3 2026-03-08T23:26:52.672 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.3 2026-03-08T23:26:52.672 INFO:tasks.workunit.client.0.vm03.stderr:416263: start osd.3 2026-03-08T23:26:52.672 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 3 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/3 --osd-journal=td/osd-scrub-repair/3/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' 
'--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T23:26:52.676 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.2 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 2 SOMETHING remove 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 
2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/2 SOMETHING remove 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: remove 2#1:eb822e21:::SOMETHING:head# 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 2 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/2 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 
'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:52.677 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: 416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=0 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=0 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-repair TERM osd.0 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-repair 0 SOMETHING remove 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:26:52.691 
INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=0 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-repair/0 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-repair/0 SOMETHING remove 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: remove 1#1:eb822e21:::SOMETHING:head# 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-repair 0 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-repair 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-repair/0 
2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false ' 2026-03-08T23:26:52.691 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/2' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/2/journal' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/2 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: start osd.2 2026-03-08T23:26:52.696 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/2/whoami 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416265: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/2 --osd-journal=td/osd-scrub-repair/2/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-maxactivate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-repair/0' 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-repair/0/journal' 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-repair' 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:26:52.711 INFO:tasks.workunit.client.0.vm03.stderr:416260: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-repair/$name.log' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' 
--pid-file=td/osd-scrub-repair/$name.pid' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-repair/0 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: start osd.0 2026-03-08T23:26:52.712 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/0/whoami 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=3f205671-6f4e-43b2-8df9-7958536b87d7 --auth-supported=none --mon-host=127.0.0.1:7107 --osd-skip-data-digest=false --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-repair/0 --osd-journal=td/osd-scrub-repair/0/journal --chdir= --run-dir=td/osd-scrub-repair '--admin-socket=/tmp/ceph-asok.43024/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-repair/$name.log' '--pid-file=td/osd-scrub-repair/$name.pid' --osd-max-object-name-len=460 --osd-max416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-repair/3/whoami 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 3 = 3 ']' 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: 2026-03-08T23:26:52.696+0000 7fe4c69a18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: 2026-03-08T23:26:52.712+0000 7fe4c69a18c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: 2026-03-08T23:26:52.724+0000 7fe4c69a18c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 3 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=3 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: 0 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:56.987 
INFO:tasks.workunit.client.0.vm03.stderr:416263: 2026-03-08T23:26:53.436+0000 7fe4c69a18c0 -1 Falling back to public interface 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:26:56.987 INFO:tasks.workunit.client.0.vm03.stderr:416263: 1 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: 2026-03-08T23:26:54.476+0000 7fe4c69a18c0 -1 osd.3 29 log_to_monitors true 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: 2 2026-03-08T23:26:56.988 
INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: 3 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:56.988 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:57.093 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helper-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 2026-03-08T23:26:52.764+0000 7f063109c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 2026-03-08T23:26:52.788+0000 7f063109c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 2026-03-08T23:26:52.796+0000 7f063109c8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 0 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 2026-03-08T23:26:53.512+0000 7f063109c8c0 -1 Falling back to public interface 2026-03-08T23:26:57.094 
INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 1 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 2026-03-08T23:26:54.916+0000 7f063109c8c0 -1 osd.0 29 log_to_monitors true 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 2 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 
2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: 3 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:57.094 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helper-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: 
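The repeated `wait_for_osd` frames in the trace above follow one pattern: poll `ceph osd dump` up to 300 times, once per second, until `osd.<id> <state>` appears. A minimal sketch of that loop, with `osd_dump` as a stub standing in for the real `ceph osd dump` call (assumption: the stub reports osd.3 up from the third poll onward, and the sleep is shortened so the sketch runs quickly):

```shell
# Stub for `ceph osd dump` (assumption: takes the attempt counter so it can
# simulate the OSD coming up on the third poll; the real CLI takes no counter).
osd_dump() {
    if [ "$1" -ge 2 ]; then
        echo "osd.3 up   in  weight 1"
    fi
}

wait_for_osd() {
    local state=$1
    local id=$2
    local status=1
    for ((i = 0; i < 300; i++)); do
        echo $i                       # the helper echoes the attempt counter
        if osd_dump "$i" | grep -q "osd\.$id $state"; then
            status=0                  # found "osd.<id> <state>" in the dump
            break
        fi
        sleep 0.01                    # the real helper sleeps a full second
    done
    return $status
}

wait_for_osd up 3
```

This matches the trace's shape: the counter is echoed each iteration (`echo 0`, `echo 1`, ...), and the function returns 0 as soon as the grep matches, or nonzero after 300 misses.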
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: 2026-03-08T23:26:52.756+0000 7f01fcf858c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: 2026-03-08T23:26:52.776+0000 7f01fcf858c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: 2026-03-08T23:26:52.784+0000 7f01fcf858c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: 
wait_for_osd: echo 0 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: 0 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: 2026-03-08T23:26:53.476+0000 7f01fcf858c0 -1 Falling back to public interface 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: 1 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: 2026-03-08T23:26:54.516+0000 7f01fcf858c0 -1 osd.2 29 log_to_monitors true 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: 
sleep 1 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:57.156 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: 2 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: 3 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:57.157 INFO:tasks.workunit.client.0.vm03.stderr:416265: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:26:57.654 INFO:tasks.workunit.client.0.vm03.stderr:416265: osd.2 up in weight 1 up_from 34 up_thru 0 down_at 30 last_clean_interval [15,29) [v2:127.0.0.1:6818/3099142952,v1:127.0.0.1:6819/3099142952] [v2:127.0.0.1:6820/3099142952,v1:127.0.0.1:6821/3099142952] exists,up c91eaba1-015c-41de-9e74-419a3fe06dc7 2026-03-08T23:26:57.654 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: 
get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: 
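The `delays` array expanded above for `get_timeout_delays 90 .1` shows the schedule the helper builds: delays double from the first step, are capped (at 15s here, an assumption inferred from the expanded array rather than from the helper's source), and a final remainder pads the sum to the requested timeout. A sketch reproducing that schedule, using awk for the floating-point arithmetic:

```shell
# Sketch of the backoff schedule seen in the trace (not the helper's actual
# implementation): doubling delays, capped, padded to sum to the timeout.
get_timeout_delays_sketch() {
    local timeout=$1 first=$2 cap=${3:-15}
    awk -v t="$timeout" -v s="$first" -v cap="$cap" 'BEGIN {
        total = 0
        while (total + s < t) {
            printf "%s%g", (total ? " " : ""), s
            total += s
            s = (s * 2 > cap) ? cap : s * 2   # double, but never exceed cap
        }
        if (t - total > 0) printf " %g", t - total   # remainder pads the sum
        print ""
    }'
}

get_timeout_delays_sketch 90 .1
```

With timeout 90 and first step .1 this yields exactly the array in the trace: 0.1 through 12.8 doubling, four 15s steps, then a 4.5s remainder, summing to 90.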
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: 1 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: 2 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: 3' 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855362 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855362 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855362' 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672968 2026-03-08T23:26:57.655 
INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672968 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855362 1-42949672968' 2026-03-08T23:26:57.655 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph s.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: 4 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.3 up' 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: osd.3 up in weight 1 up_from 34 up_thru 0 down_at 30 last_clean_interval [20,29) [v2:127.0.0.1:6802/2121844942,v1:127.0.0.1:6803/2121844942] [v2:127.0.0.1:6804/2121844942,v1:127.0.0.1:6805/2121844942] exists,up f024040b-6f96-4a3e-b783-b9782ad81ba9 
2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:26:58.554 
INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:26:58.554 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: 1 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: 2 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: 3' 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855363 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855363 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855363' 2026-03-08T23:26:58.555 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-hs.sh:985: wait_for_osd: sleep 1 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: 4 2026-03-08T23:26:58.669 
INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: osd.0 up in weight 1 up_from 35 up_thru 0 down_at 30 last_clean_interval [5,29) [v2:127.0.0.1:6826/2355399249,v1:127.0.0.1:6827/2355399249] [v2:127.0.0.1:6828/2355399249,v1:127.0.0.1:6829/2355399249] exists,up 8256f948-8a50-451a-be57-db2f5c1f156a 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: ///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 
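The `delays` array expanded at `ceph-helpers.sh:1659` above (`0.1 0.2 0.4 … 15 15 15 15 4.5`) is a backoff schedule summing to the 90-second budget passed to `get_timeout_delays`: each delay doubles from the first step, is capped (here at 15 s), and a final remainder pads the total to exactly the timeout. A runnable sketch of that derivation; the integer tenths-of-a-second arithmetic and the hardcoded 15 s cap are illustrative assumptions (the real helper computes with `bc`):

```shell
# Derive a backoff schedule like the delays=(...) expansion in the
# trace: doubling delays capped at 15s, padded so they sum to the
# requested timeout.  All arithmetic is done in integer tenths.
get_timeout_delays_sketch() {
    local timeout_tenths=$(( 10 * $1 ))   # e.g. 90 -> 900
    local step_tenths=$2                  # first step in tenths (1 -> 0.1s)
    local cap_tenths=150                  # cap each delay at 15s (assumption)
    local total=0 out=()
    while (( total + step_tenths <= timeout_tenths )); do
        out+=( "$(awk -v t="$step_tenths" 'BEGIN { print t / 10 }')" )
        (( total += step_tenths ))
        (( step_tenths *= 2 ))
        (( step_tenths > cap_tenths )) && step_tenths=$cap_tenths
    done
    if (( total < timeout_tenths )); then # pad with the remainder
        out+=( "$(awk -v r=$(( timeout_tenths - total )) 'BEGIN { print r / 10 }')" )
    fi
    echo "${out[@]}"
}

get_timeout_delays_sketch 90 1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```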
2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: 1 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: 2 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: 3' 2026-03-08T23:26:58.669 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:26:58.670 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:58.670 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:26:58.670 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=150323855364 2026-03-08T23:26:58.670 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 150323855364 2026-03-08T23:26:58.670 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855364' 2026-03-08T23:26:58.670 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
ceph tell osd.1 flush_pg_stats 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855364 1-42949672970' 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888068 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888068 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855364 1-42949672970 2-146028888068' 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888068 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888068 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855364 1-42949672970 2-146028888068 3-146028888068' 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-150323855364 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-150323855364 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855364 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 150323855364' 2026-03-08T23:26:59.328 
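Each `flush_pg_stats` pass traced above tells every OSD to republish its pg stats, collects the returned sequence numbers into a list of `osd-seq` pairs, then waits until `ceph osd last-stat-seq` for each OSD reaches the recorded value. A runnable sketch with both ceph calls replaced by arrays; `PUBLISHED` and `MON_SEEN` are stubs (values copied from the trace, with the monitor already caught up) standing in for live cluster state:

```shell
# PUBLISHED stands in for `ceph tell osd.N flush_pg_stats`,
# MON_SEEN for `ceph osd last-stat-seq N`; both are stubs.
PUBLISHED=(150323855364 42949672970 146028888068 146028888068)
MON_SEEN=( 150323855364 42949672970 146028888068 146028888068)

flush_pg_stats_sketch() {
    local ids="0 1 2 3" seqs="" osd seq s
    for osd in $ids; do
        seq=${PUBLISHED[$osd]}             # ceph tell osd.$osd flush_pg_stats
        test -z "$seq" && continue         # no answer from the osd: skip it
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)
        seq=$(echo "$s" | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test "${MON_SEEN[$osd]}" -lt "$seq"; do
            sleep 0.1                      # real helper sleeps 1s, bounded by a timeout
        done
    done
}

flush_pg_stats_sketch
```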
INFO:tasks.workunit.client.0.vm03.stderr:416260: waiting osd.0 seq 150323855364 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855364 -lt 150323855364 2026-03-08T23:26:59.328 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672970 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970' 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: waiting osd.1 seq 
42949672970 2026-03-08T23:26:59.329 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:22tell osd.2 flush_pg_stats 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888066 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888066 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855362 1-42949672968 2-146028888066' 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888066 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888066 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855362 1-42949672968 2-146028888066 3-146028888066' 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-150323855362 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-150323855362 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855362 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 150323855362' 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: waiting osd.0 seq 150323855362 2026-03-08T23:26:59.432 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 150323855362 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: 
flush_pg_stats: sleep 1 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855364 -lt 150323855362 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672968 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672968 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672968 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.1 seq 42949672968' 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: waiting osd.1 seq 42949672968 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672968 2026-03-08T23:26:59.433 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416265: //hom77: flush_pg_stats: test 42949672970 -lt 42949672970 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-146028888068 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-146028888068 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888068 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 146028888068' 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: waiting osd.2 seq 146028888068 2026-03-08T23:27:00.215 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888068 -lt 146028888068 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-146028888068 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-146028888068 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888068 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 146028888068' 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: waiting osd.3 seq 146028888068 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888068 -lt 146028888068 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:27:00.216 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_cleane/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: echo 2-146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 146028888066' 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: waiting osd.2 seq 146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888068 -lt 146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-146028888066 2026-03-08T23:27:00.265 
INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 146028888066' 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: waiting osd.3 seq 146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888068 -lt 146028888066 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:27:00.265 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:27:00.458 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qelpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:27:00.458 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672969 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672969 2026-03-08T23:27:00.459 
INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855363 1-42949672969' 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888067 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888067 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855363 1-42949672969 2-146028888067' 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.3 flush_pg_stats 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=146028888067 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 146028888067 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-150323855363 1-42949672969 2-146028888067 3-146028888067' 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-150323855363 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-150323855363 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=150323855363 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 150323855363' 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: waiting osd.0 seq 150323855363 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 150323855363 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 150323855364 -lt 150323855363 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672969 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672969 2026-03-08T23:27:00.459 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:27:00.488 INFO:tasks.workunit.client.0.vm03.stderr:416263: 
/home/ubunta/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:27:00.488 INFO:tasks.workunit.client.0.vm03.stderr:416265: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:27:00.488 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:27:00.488 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:27:00.488 INFO:tasks.workunit.client.0.vm03.stderr:416265: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:27:00.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:: cur_active_clean=1 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:416260: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:416260: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:416260: 
/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 416258 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2229: wait_background: for pid in $pids 2026-03-08T23:27:00.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2230: wait_background: wait 416261 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:u/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672969 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672969' 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: waiting osd.1 seq 42949672969 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672969 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:27:01.254 
INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 146028888067' 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: waiting osd.2 seq 146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888068 -lt 146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 3-146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=3 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 3-146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.3 seq 146028888067' 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: waiting osd.3 seq 146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 3 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 146028888069 -lt 146028888067 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: 
//home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:27:01.254 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:27:01.662 
INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: //home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:416263: /home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2194: run_in_background: return 0 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: eval 'pids='\'''\''' 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2237: wait_background: pids= 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2239: wait_background: return 0 2026-03-08T23:27:01.662 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:881: unfound_erasure_coded: return_code=0 2026-03-08T23:27:01.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:882: unfound_erasure_coded: '[' 0 -ne 0 ']' 2026-03-08T23:27:01.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:887: unfound_erasure_coded: get_pg ecpool SOMETHING 2026-03-08T23:27:01.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=ecpool 2026-03-08T23:27:01.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:27:01.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map ecpool SOMETHING 2026-03-08T23:27:01.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:27:01.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:887: unfound_erasure_coded: local pg=1.0 2026-03-08T23:27:01.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:888: unfound_erasure_coded: repair 1.0 2026-03-08T23:27:01.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.0 2026-03-08T23:27:01.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: get_last_scrub_stamp 1.0 2026-03-08T23:27:01.853 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:27:01.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:27:01.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:27:01.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:27:02.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:02.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.0 2026-03-08T23:27:02.208 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0s0 on osd.1 to repair 2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.0 2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 
2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:27:02.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:27:02.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:27:02.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:27:02.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:26:46.100834+0000 '>' 2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:02.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:27:03.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:27:03.405 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:27:03.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:27:03.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:27:03.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:27:03.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:27:03.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:27:03.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:26:46.100834+0000 '>' 2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:03.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:27:04.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:27:04.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:27:04.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:27:04.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:27:04.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:27:04.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:27:04.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:27:04.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:26:46.100834+0000 '>' 2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:04.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:27:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:27:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:27:05.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:27:05.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:27:05.769 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:27:05.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:27:05.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:27:05.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:26:46.100834+0000 '>' 2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:05.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:27:06.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:27:06.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:27:06.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:27:06.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:27:06.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:27:06.943 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:27:06.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:27:07.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:26:46.100834+0000 '>' 2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:07.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:27:08.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:27:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:27:08.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:27:08.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:27:08.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:27:08.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:27:08.121 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:27:08.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:26:46.100834+0000 '>' 2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:08.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:27:09.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:27:09.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:27:09.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:27:09.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:27:09.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:27:09.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:27:09.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:27:09.466 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:27:03.043964+0000 '>' 2026-03-08T23:26:46.100834+0000 2026-03-08T23:27:09.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:27:09.466 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:893: unfound_erasure_coded: seq 1 60 2026-03-08T23:27:09.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:893: unfound_erasure_coded: for f in `seq 1 60` 2026-03-08T23:27:09.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:894: unfound_erasure_coded: ceph -s 2026-03-08T23:27:09.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:894: unfound_erasure_coded: grep '1/1 objects unfound' 2026-03-08T23:27:09.689 INFO:tasks.workunit.client.0.vm03.stdout: 1/1 objects unfound (100.000%) 2026-03-08T23:27:09.689 INFO:tasks.workunit.client.0.vm03.stdout: 1/1 objects unfound (100.000%) 2026-03-08T23:27:09.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:894: unfound_erasure_coded: break 2026-03-08T23:27:09.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:897: unfound_erasure_coded: ceph -s 2026-03-08T23:27:09.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:897: unfound_erasure_coded: grep '4 up' 2026-03-08T23:27:09.901 INFO:tasks.workunit.client.0.vm03.stdout: osd: 4 osds: 4 up (since 12s), 4 in (since 30s) 2026-03-08T23:27:09.902 
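The xtrace above shows `wait_for_scrub` polling `get_last_scrub_stamp` once per second (up to 300 tries) until the PG's `last_scrub_stamp` advances past the saved value, at which point `test "$stamp" '>' "$last"` succeeds and the helper returns 0. A minimal Python sketch of that comparison logic follows; the `sample` payload is fabricated for illustration (in the real helper the JSON comes from `ceph --format json pg dump pgs` piped through `jq`):

```python
import json

def get_last_scrub_stamp(pg_dump: dict, pgid: str, sname: str = "last_scrub_stamp") -> str:
    # Mirrors the jq filter used in ceph-helpers.sh:
    #   .pg_stats | .[] | select(.pgid==PGID) | .last_scrub_stamp
    for stat in pg_dump["pg_stats"]:
        if stat["pgid"] == pgid:
            return stat[sname]
    raise KeyError(pgid)

def scrub_advanced(before: str, now: str) -> bool:
    # ISO-8601 timestamps with a fixed +0000 offset sort correctly
    # as plain strings, which is what the shell helper's
    # `test "$stamp" '>' "$last"` relies on.
    return now > before

# Fabricated sample resembling `ceph --format json pg dump pgs` output,
# using the stamps visible in the trace above.
sample = json.loads(
    '{"pg_stats": [{"pgid": "1.0",'
    ' "last_scrub_stamp": "2026-03-08T23:27:03.043964+0000"}]}'
)
before = "2026-03-08T23:26:46.100834+0000"
stamp = get_last_scrub_stamp(sample, "1.0")
```

Because the comparison is a plain string test, the loop spins (as seen in the repeated `sleep 1` iterations above) exactly until the monitor reports a newer stamp for the PG.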
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:898: unfound_erasure_coded: ceph -s 2026-03-08T23:27:09.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:898: unfound_erasure_coded: grep '4 in' 2026-03-08T23:27:10.117 INFO:tasks.workunit.client.0.vm03.stdout: osd: 4 osds: 4 up (since 13s), 4 in (since 30s) 2026-03-08T23:27:10.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:899: unfound_erasure_coded: ceph -s 2026-03-08T23:27:10.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:899: unfound_erasure_coded: grep '1/1 objects unfound' 2026-03-08T23:27:10.323 INFO:tasks.workunit.client.0.vm03.stdout: 1/1 objects unfound (100.000%) 2026-03-08T23:27:10.323 INFO:tasks.workunit.client.0.vm03.stdout: 1/1 objects unfound (100.000%) 2026-03-08T23:27:10.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-repair.sh:66: run: teardown td/osd-scrub-repair 2026-03-08T23:27:10.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:27:10.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:27:10.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:27:10.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:27:10.324 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:27:10.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:27:10.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:27:10.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:27:10.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:27:10.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:27:10.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:27:10.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:27:10.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:27:10.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:27:10.451 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:27:10.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:27:10.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:27:10.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:27:10.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:27:10.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:27:10.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:27:10.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:27:10.473 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:27:10.473 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.473 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:27:10.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-scrub-repair 0 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-repair 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-repair KILL 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:27:10.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 
2026-03-08T23:27:10.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:27:10.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:27:10.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:27:10.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:27:10.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:27:10.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:27:10.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:27:10.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:27:10.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:27:10.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:27:10.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:27:10.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:27:10.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:27:10.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:27:10.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:27:10.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T23:27:10.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-repair 2026-03-08T23:27:10.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:27:10.482 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.43024 2026-03-08T23:27:10.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.43024 2026-03-08T23:27:10.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:27:10.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:27:10.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T23:27:10.484 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T23:27:10.484 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T23:27:10.536 INFO:tasks.workunit:Running workunit scrub/osd-scrub-snaps.sh... 
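The `teardown` trace above inspects `kernel.core_pattern` before removing the test directory: a pattern that pipes to a handler (first character `|`) is skipped, otherwise a pattern matching `^core\|core$` has its directory listed to see whether any coredumps were produced. A rough Python sketch of that check, under the assumption that any file in the coredump directory counts as a core (the helper names and the `listdir` callback here are illustrative, not part of ceph-helpers.sh):

```python
import os
import re

def cores_present(core_pattern: str, listdir) -> bool:
    # Sketch of the teardown check seen in the trace:
    # - a core_pattern that pipes to a program is ignored
    # - otherwise, if the pattern starts or ends with "core",
    #   any entry in its dirname is treated as a coredump
    if core_pattern.startswith("|"):
        return False
    if re.search(r"^core|core$", core_pattern):
        return bool(listdir(os.path.dirname(core_pattern)))
    return False

# The pattern from this run; the empty listing mirrors the trace,
# where `ls /home/ubuntu/cephtest/archive/coredump` found nothing
# and teardown proceeded to `rm -fr` the test dir.
pattern = "/home/ubuntu/cephtest/archive/coredump/%t.%p.core"
clean = cores_present(pattern, lambda d: [])
```

When the listing is non-empty, the helper sets `cores=yes` and dumps daemon logs instead of silently cleaning up, which is why the `'[' no = yes -o '' = 1 ']'` branch above falls through to `rm -fr`.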
2026-03-08T23:27:10.537 DEBUG:teuthology.orchestra.run.vm03:workunit test scrub/osd-scrub-snaps.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh 2026-03-08T23:27:10.588 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T23:27:10.592 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-scrub-snaps 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:27: run: local dir=td/osd-scrub-snaps 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:28: run: shift 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:30: run: export CEPH_MON=127.0.0.1:7121 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:30: run: CEPH_MON=127.0.0.1:7121 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:31: run: export CEPH_ARGS 2026-03-08T23:27:10.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:32: run: uuidgen 2026-03-08T23:27:10.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:32: run: CEPH_ARGS+='--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none ' 2026-03-08T23:27:10.593 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:33: run: CEPH_ARGS+='--mon-host=127.0.0.1:7121 ' 2026-03-08T23:27:10.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:35: run: export -n CEPH_CLI_TEST_DUP_COMMAND 2026-03-08T23:27:10.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:36: run: set 2026-03-08T23:27:10.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:36: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:36: run: local 'funcs=TEST_scrub_snaps 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:TEST_scrub_snaps_primary 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:TEST_scrub_snaps_replica' 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:37: run: for func in $funcs 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:38: run: setup td/osd-scrub-snaps 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-snaps 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-snaps 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-snaps 2026-03-08T23:27:10.595 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-snaps KILL 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:27:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:27:10.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:27:10.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:27:10.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:27:10.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:27:10.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:27:10.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:27:10.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:27:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:27:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:27:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:27:10.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:27:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:27:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:27:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-snaps 2026-03-08T23:27:10.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:27:10.602 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.602 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:27:10.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.420670 2026-03-08T23:27:10.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:27:10.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:27:10.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-snaps 2026-03-08T23:27:10.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:27:10.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:27:10.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.420670 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 
2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-snaps 1' TERM HUP INT 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:39: run: TEST_scrub_snaps td/osd-scrub-snaps 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:171: TEST_scrub_snaps: local dir=td/osd-scrub-snaps 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:172: TEST_scrub_snaps: local poolname=test 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:173: TEST_scrub_snaps: local OBJS=16 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:174: TEST_scrub_snaps: local OSDS=1 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:176: TEST_scrub_snaps: TESTDATA=testdata.420670 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:178: TEST_scrub_snaps: run_mon td/osd-scrub-snaps a --osd_pool_default_size=1 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local 
dir=td/osd-scrub-snaps 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-snaps/a 2026-03-08T23:27:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-snaps/a --run-dir=td/osd-scrub-snaps --osd_pool_default_size=1 2026-03-08T23:27:10.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:27:10.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:27:10.630 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:27:10.630 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:27:10.630 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.630 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:27:10.630 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:27:10.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-snaps/a '--log-file=td/osd-scrub-snaps/$name.log' '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-snaps/log --run-dir=td/osd-scrub-snaps '--pid-file=td/osd-scrub-snaps/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=1 2026-03-08T23:27:10.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:27:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:27:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:27:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:27:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:27:10.661 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path 
mon.a 2026-03-08T23:27:10.661 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:27:10.661 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:27:10.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:27:10.665 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:27:10.665 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.665 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:27:10.666 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.420670/ceph-mon.a.asok 2026-03-08T23:27:10.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:27:10.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.420670/ceph-mon.a.asok config get fsid 2026-03-08T23:27:10.740 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:27:10.741 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.420670/ceph-mon.a.asok 2026-03-08T23:27:10.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:27:10.741 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.420670/ceph-mon.a.asok config get mon_host 2026-03-08T23:27:10.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:179: TEST_scrub_snaps: run_mgr td/osd-scrub-snaps x 2026-03-08T23:27:10.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-snaps 2026-03-08T23:27:10.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:27:10.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:27:10.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:27:10.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-snaps/x 2026-03-08T23:27:10.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:27:10.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:27:10.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:27:10.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:27:10.925 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:27:10.925 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.925 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:27:10.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:27:10.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:27:10.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-snaps/x '--log-file=td/osd-scrub-snaps/$name.log' '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --run-dir=td/osd-scrub-snaps '--pid-file=td/osd-scrub-snaps/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:27:10.945 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:180: TEST_scrub_snaps: expr 1 - 1 2026-03-08T23:27:10.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:180: TEST_scrub_snaps: seq 0 0 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:180: TEST_scrub_snaps: for osd in $(seq 0 $(expr $OSDS - 1)) 
2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:182: TEST_scrub_snaps: run_osd td/osd-scrub-snaps 0 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-snaps 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-snaps/0 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 ' 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' 
--osd-data=td/osd-scrub-snaps/0' 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/0/journal' 2026-03-08T23:27:10.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:27:10.954 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:27:10.954 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:27:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-snaps/0 2026-03-08T23:27:10.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:27:10.956 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 4b01d465-0e34-439f-9731-c93aae66efd4 2026-03-08T23:27:10.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=4b01d465-0e34-439f-9731-c93aae66efd4 2026-03-08T23:27:10.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 4b01d465-0e34-439f-9731-c93aae66efd4' 2026-03-08T23:27:10.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:27:10.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDOBa5pmJIiOhAAi8Vp6ICk2sDMlwjpOm6uJw== 2026-03-08T23:27:10.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDOBa5pmJIiOhAAi8Vp6ICk2sDMlwjpOm6uJw=="}' 2026-03-08T23:27:10.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 4b01d465-0e34-439f-9731-c93aae66efd4 -i td/osd-scrub-snaps/0/new.json 2026-03-08T23:27:11.081 
INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:27:11.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-snaps/0/new.json 2026-03-08T23:27:11.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQDOBa5pmJIiOhAAi8Vp6ICk2sDMlwjpOm6uJw== --osd-uuid 4b01d465-0e34-439f-9731-c93aae66efd4 2026-03-08T23:27:11.112 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:11.112+0000 7fbaf7ee68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:27:11.114 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:11.116+0000 7fbaf7ee68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:27:11.116 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:11.120+0000 7fbaf7ee68c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:27:11.116 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:11.120+0000 7fbaf7ee68c0 -1 bdev(0x56364038ac00 td/osd-scrub-snaps/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:27:11.116 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:11.120+0000 7fbaf7ee68c0 -1 bluestore(td/osd-scrub-snaps/0) _read_fsid unparsable uuid 2026-03-08T23:27:13.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-snaps/0/keyring 2026-03-08T23:27:13.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:27:13.375 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:27:13.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:27:13.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-snaps/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:27:13.477 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:27:13.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:27:13.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= --run-dir=td/osd-scrub-snaps 
'--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:27:13.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:27:13.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:27:13.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:27:13.517 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:13.520+0000 7eff546c88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:27:13.533 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:13.536+0000 7eff546c88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:27:13.544 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:13.544+0000 7eff546c88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:27:13.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:27:13.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:27:13.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:27:14.737 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:14.740+0000 7eff546c88c0 -1 Falling back to public interface 2026-03-08T23:27:14.743 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:27:14.744 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:27:14.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:27:14.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:27:14.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:27:14.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:27:14.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:27:15.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:15.732+0000 7eff546c88c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:27:15.921 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:27:15.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:27:15.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:27:15.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:27:15.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:27:15.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:27:16.108 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:27:17.109 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:27:17.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:27:17.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:27:17.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:27:17.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:27:17.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:27:17.313 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:27:17.316+0000 7eff4fe81640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:27:17.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:27:18.335 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:27:18.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:27:18.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:27:18.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:27:18.335 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:27:18.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:27:18.503 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/867448566,v1:127.0.0.1:6803/867448566] [v2:127.0.0.1:6804/867448566,v1:127.0.0.1:6805/867448566] exists,up 4b01d465-0e34-439f-9731-c93aae66efd4 2026-03-08T23:27:18.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:27:18.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:27:18.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:27:18.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:186: TEST_scrub_snaps: ceph osd set noscrub 2026-03-08T23:27:18.715 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:27:18.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:187: TEST_scrub_snaps: ceph osd set nodeep-scrub 2026-03-08T23:27:18.947 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:27:18.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:190: TEST_scrub_snaps: create_pool test 1 1 2026-03-08T23:27:18.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 
2026-03-08T23:27:19.153 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created
2026-03-08T23:27:19.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:27:20.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:191: TEST_scrub_snaps: wait_for_clean
2026-03-08T23:27:20.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:27:20.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:27:20.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:27:20.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:27:20.171 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:27:20.171 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:27:20.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:27:20.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:27:20.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:27:20.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:27:20.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:27:20.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:27:20.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:27:20.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:27:20.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:27:20.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0
2026-03-08T23:27:20.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:27:20.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:27:20.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:27:20.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836482
2026-03-08T23:27:20.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836482
2026-03-08T23:27:20.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836482'
2026-03-08T23:27:20.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:27:20.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836482
2026-03-08T23:27:20.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:27:20.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:27:20.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836482
2026-03-08T23:27:20.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:27:20.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836482
2026-03-08T23:27:20.538 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836482
2026-03-08T23:27:20.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836482'
2026-03-08T23:27:20.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:27:20.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836481 -lt 21474836482
2026-03-08T23:27:20.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:27:21.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:27:21.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:27:21.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836482
2026-03-08T23:27:21.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:27:21.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:27:21.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:27:22.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T23:27:22.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:27:22.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:27:22.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:27:22.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:27:22.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:27:22.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:27:22.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:27:22.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T23:27:22.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:27:22.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:27:22.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:27:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T23:27:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:27:22.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:27:22.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:192: TEST_scrub_snaps: ceph osd dump
2026-03-08T23:27:22.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:192: TEST_scrub_snaps: awk '{ print $2 }'
2026-03-08T23:27:22.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:192: TEST_scrub_snaps: grep '^pool.*['\'']test['\'']'
2026-03-08T23:27:22.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:192: TEST_scrub_snaps: poolid=1
2026-03-08T23:27:22.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:194: TEST_scrub_snaps: dd if=/dev/urandom of=testdata.420670 bs=1032 count=1
2026-03-08T23:27:22.672 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in
2026-03-08T23:27:22.672 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out
2026-03-08T23:27:22.672 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 9.3495e-05 s, 11.0 MB/s
2026-03-08T23:27:22.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: seq 1 16
2026-03-08T23:27:22.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj1 testdata.420670
2026-03-08T23:27:22.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj2 testdata.420670
2026-03-08T23:27:22.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj3 testdata.420670
2026-03-08T23:27:22.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj4 testdata.420670
2026-03-08T23:27:22.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj5 testdata.420670
2026-03-08T23:27:22.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj6 testdata.420670
2026-03-08T23:27:22.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj7 testdata.420670
2026-03-08T23:27:22.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj8 testdata.420670
2026-03-08T23:27:22.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj9 testdata.420670
2026-03-08T23:27:22.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj10 testdata.420670
2026-03-08T23:27:22.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj11 testdata.420670
2026-03-08T23:27:22.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj12 testdata.420670
2026-03-08T23:27:22.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj13 testdata.420670
2026-03-08T23:27:22.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj14 testdata.420670
2026-03-08T23:27:22.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj15 testdata.420670
2026-03-08T23:27:22.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:195: TEST_scrub_snaps: for i in `seq 1 $OBJS`
2026-03-08T23:27:22.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:197: TEST_scrub_snaps: rados -p test put obj16 testdata.420670
2026-03-08T23:27:22.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:200: TEST_scrub_snaps: get_primary test obj1
2026-03-08T23:27:22.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test
2026-03-08T23:27:22.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1
2026-03-08T23:27:22.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1
2026-03-08T23:27:22.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary
2026-03-08T23:27:23.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:200: TEST_scrub_snaps: local primary=0
2026-03-08T23:27:23.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:202: TEST_scrub_snaps: create_scenario td/osd-scrub-snaps test testdata.420670 0
2026-03-08T23:27:23.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:45: create_scenario: local dir=td/osd-scrub-snaps
2026-03-08T23:27:23.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:46: create_scenario: local poolname=test
2026-03-08T23:27:23.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:47: create_scenario: local TESTDATA=testdata.420670
2026-03-08T23:27:23.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:48: create_scenario: local osd=0
2026-03-08T23:27:23.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:50: create_scenario: SNAP=1
2026-03-08T23:27:23.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:51: create_scenario: rados -p test mksnap snap1
2026-03-08T23:27:23.249 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap1
2026-03-08T23:27:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:52: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=1
2026-03-08T23:27:23.253 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in
2026-03-08T23:27:23.253 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out
2026-03-08T23:27:23.253 INFO:tasks.workunit.client.0.vm03.stderr:256 bytes copied, 7.9407e-05 s, 3.2 MB/s
2026-03-08T23:27:23.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:53: create_scenario: rados -p test put obj1 testdata.420670
2026-03-08T23:27:23.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:54: create_scenario: rados -p test put obj5 testdata.420670
2026-03-08T23:27:23.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:55: create_scenario: rados -p test put obj3 testdata.420670
2026-03-08T23:27:23.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: seq 6 14
2026-03-08T23:27:23.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj6 testdata.420670
2026-03-08T23:27:23.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj7 testdata.420670
2026-03-08T23:27:23.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj8 testdata.420670
2026-03-08T23:27:23.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj9 testdata.420670
2026-03-08T23:27:23.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj10 testdata.420670
2026-03-08T23:27:23.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj11 testdata.420670
2026-03-08T23:27:23.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj12 testdata.420670
2026-03-08T23:27:23.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj13 testdata.420670
2026-03-08T23:27:23.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14`
2026-03-08T23:27:23.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj14 testdata.420670
2026-03-08T23:27:23.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:60: create_scenario: SNAP=2
2026-03-08T23:27:23.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:61: create_scenario: rados -p test mksnap snap2
2026-03-08T23:27:23.618 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap2
2026-03-08T23:27:23.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:62: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=2
2026-03-08T23:27:23.621 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records in
2026-03-08T23:27:23.621 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records out
2026-03-08T23:27:23.621 INFO:tasks.workunit.client.0.vm03.stderr:512 bytes copied, 5.865e-05 s, 8.7 MB/s
2026-03-08T23:27:23.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:63: create_scenario: rados -p test put obj5 testdata.420670
2026-03-08T23:27:23.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:65: create_scenario: SNAP=3
2026-03-08T23:27:23.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:66: create_scenario: rados -p test mksnap snap3
2026-03-08T23:27:23.734 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap3
2026-03-08T23:27:23.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:67: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=3
2026-03-08T23:27:23.738 INFO:tasks.workunit.client.0.vm03.stderr:3+0 records in
2026-03-08T23:27:23.738 INFO:tasks.workunit.client.0.vm03.stderr:3+0 records out
2026-03-08T23:27:23.738 INFO:tasks.workunit.client.0.vm03.stderr:768 bytes copied, 7.7846e-05 s, 9.9 MB/s
2026-03-08T23:27:23.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:68: create_scenario: rados -p test put obj3 testdata.420670
2026-03-08T23:27:23.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:70: create_scenario: SNAP=4
2026-03-08T23:27:23.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:71: create_scenario: rados -p test mksnap snap4
2026-03-08T23:27:23.829 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap4
2026-03-08T23:27:23.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:72: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=4
2026-03-08T23:27:23.833 INFO:tasks.workunit.client.0.vm03.stderr:4+0 records in
2026-03-08T23:27:23.833 INFO:tasks.workunit.client.0.vm03.stderr:4+0 records out
2026-03-08T23:27:23.833 INFO:tasks.workunit.client.0.vm03.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 5.9662e-05 s, 17.2 MB/s
2026-03-08T23:27:23.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:73: create_scenario: rados -p test put obj5 testdata.420670
2026-03-08T23:27:23.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:74: create_scenario: rados -p test put obj2 testdata.420670
2026-03-08T23:27:23.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:76: create_scenario: SNAP=5
2026-03-08T23:27:23.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:77: create_scenario: rados -p test mksnap snap5
2026-03-08T23:27:23.933 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap5
2026-03-08T23:27:23.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:78: create_scenario: SNAP=6
2026-03-08T23:27:23.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:79: create_scenario: rados -p test mksnap snap6
2026-03-08T23:27:24.037 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap6
2026-03-08T23:27:24.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:80: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=6
2026-03-08T23:27:24.041 INFO:tasks.workunit.client.0.vm03.stderr:6+0 records in
2026-03-08T23:27:24.041 INFO:tasks.workunit.client.0.vm03.stderr:6+0 records out
2026-03-08T23:27:24.041 INFO:tasks.workunit.client.0.vm03.stderr:1536 bytes (1.5 kB, 1.5 KiB) copied, 8.9617e-05 s, 17.1 MB/s
2026-03-08T23:27:24.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:81: create_scenario: rados -p test put obj5 testdata.420670
2026-03-08T23:27:24.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:83: create_scenario: SNAP=7
2026-03-08T23:27:24.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:84: create_scenario: rados -p test mksnap snap7
2026-03-08T23:27:24.142 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap7
2026-03-08T23:27:24.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:86: create_scenario: rados -p test rm obj4
2026-03-08T23:27:24.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:87: create_scenario: rados -p test rm obj16
2026-03-08T23:27:24.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:88: create_scenario: rados -p test rm obj2
2026-03-08T23:27:24.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:90: create_scenario: kill_daemons td/osd-scrub-snaps TERM osd
2026-03-08T23:27:24.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:27:24.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:27:24.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:27:24.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:27:24.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:27:24.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:27:24.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:94: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj1
2026-03-08T23:27:24.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:94: create_scenario: JSON='["1.0",{"oid":"obj1","key":"","snapid":-2,"hash":1828249343,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:24.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:95: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj1","key":"","snapid":-2,"hash":1828249343,"max":0,"pool":1,"namespace":"","max":0}]' --force remove
2026-03-08T23:27:25.583 INFO:tasks.workunit.client.0.vm03.stdout:WARNING: only removing head with clones present
2026-03-08T23:27:25.583 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:ff7b1f36:::obj1:head#
2026-03-08T23:27:26.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj5
2026-03-08T23:27:26.115 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: grep '"snapid":2'
2026-03-08T23:27:26.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":2,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:26.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:98: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj5","key":"","snapid":2,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' remove
2026-03-08T23:27:27.593 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:c52c9666:::obj5:2#
2026-03-08T23:27:28.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj5
2026-03-08T23:27:28.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: grep '"snapid":1'
2026-03-08T23:27:28.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:28.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:101: create_scenario: OBJ5SAVE='["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:28.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:103: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/0 list
2026-03-08T23:27:30.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:104: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.0779578A.7.obj4..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.0779578A.7.obj4..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.0779578A.7.obj4..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.0779578A.7.obj4..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5..
2026-03-08T23:27:30.047 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.0779578A.7.obj4.. 
2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:105: create_scenario: grep '^[pm].*SNA_.*[.]1[.]obj5[.][.]$' td/osd-scrub-snaps/drk.log 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5.. 2026-03-08T23:27:30.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:106: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --rmtype nosnapmap '["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' remove 2026-03-08T23:27:30.681 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:c52c9666:::obj5:1# 2026-03-08T23:27:31.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:108: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/0 list 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:109: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.0779578A.7.obj4.. 
2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2.. 
2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1.. 2026-03-08T23:27:32.291 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3.. 
2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5.. 
2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:110: create_scenario: grep '^[pm].*SNA_.*[.]1[.]obj5[.][.]$' td/osd-scrub-snaps/drk.log 2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5.. 
2026-03-08T23:27:32.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:111: create_scenario: rm -f td/osd-scrub-snaps/drk.log 2026-03-08T23:27:32.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj5 2026-03-08T23:27:32.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: grep '"snapid":4' 2026-03-08T23:27:33.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":4,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:27:33.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:114: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=18 2026-03-08T23:27:33.132 INFO:tasks.workunit.client.0.vm03.stderr:18+0 records in 2026-03-08T23:27:33.132 INFO:tasks.workunit.client.0.vm03.stderr:18+0 records out 2026-03-08T23:27:33.132 INFO:tasks.workunit.client.0.vm03.stderr:4608 bytes (4.6 kB, 4.5 KiB) copied, 0.000107051 s, 43.0 MB/s 2026-03-08T23:27:33.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:115: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj5","key":"","snapid":4,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670 2026-03-08T23:27:34.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:117: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj3 
2026-03-08T23:27:35.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:117: create_scenario: JSON='["1.0",{"oid":"obj3","key":"","snapid":-2,"hash":1643547569,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:27:35.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:118: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=15 2026-03-08T23:27:35.155 INFO:tasks.workunit.client.0.vm03.stderr:15+0 records in 2026-03-08T23:27:35.155 INFO:tasks.workunit.client.0.vm03.stderr:15+0 records out 2026-03-08T23:27:35.155 INFO:tasks.workunit.client.0.vm03.stderr:3840 bytes (3.8 kB, 3.8 KiB) copied, 0.000123371 s, 31.1 MB/s 2026-03-08T23:27:35.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:119: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj3","key":"","snapid":-2,"hash":1643547569,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670 2026-03-08T23:27:36.327 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj4 2026-03-08T23:27:36.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: grep '"snapid":7' 2026-03-08T23:27:37.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: JSON='["1.0",{"oid":"obj4","key":"","snapid":7,"hash":2826278768,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:27:37.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:122: create_scenario: ceph-objectstore-tool 
--data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj4","key":"","snapid":7,"hash":2826278768,"max":0,"pool":1,"namespace":"","max":0}]' remove 2026-03-08T23:27:37.791 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:0ee9ae15:::obj4:7# 2026-03-08T23:27:38.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:125: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/0 list 2026-03-08T23:27:39.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:126: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12.. 
2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2.. 
2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.399 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2.. 
2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:127: create_scenario: grep '^[pm].*SNA_.*[.]7[.]obj16[.][.]$' td/osd-scrub-snaps/drk.log 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16.. 
2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj16 2026-03-08T23:27:39.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: grep '"snapid":7' 2026-03-08T23:27:40.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: JSON='["1.0",{"oid":"obj16","key":"","snapid":7,"hash":2060580962,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:27:40.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:129: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --rmtype snapmap '["1.0",{"oid":"obj16","key":"","snapid":7,"hash":2060580962,"max":0,"pool":1,"namespace":"","max":0}]' remove 2026-03-08T23:27:40.871 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:461f8b5e:::obj16:7# 2026-03-08T23:27:41.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:131: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/0 list 2026-03-08T23:27:42.491 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:132: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2.. 
2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10.. 2026-03-08T23:27:42.491 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5.. 
2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2.. 
2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:133: create_scenario: grep '^[pm].*SNA_.*[.]7[.]obj16[.][.]$' td/osd-scrub-snaps/drk.log
2026-03-08T23:27:42.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:134: create_scenario: rm -f td/osd-scrub-snaps/drk.log
2026-03-08T23:27:42.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:136: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj2
2026-03-08T23:27:43.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:136: create_scenario: JSON='["1.0",{"oid":"obj2","key":"","snapid":-2,"hash":1058988552,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:43.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:137: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj2","key":"","snapid":-2,"hash":1058988552,"max":0,"pool":1,"namespace":"","max":0}]' rm-attr snapset
2026-03-08T23:27:44.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: echo '["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:44.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: sed 's/snapid":1/snapid":7/'
2026-03-08T23:27:44.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":7,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:44.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:141: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=7
2026-03-08T23:27:44.508 INFO:tasks.workunit.client.0.vm03.stderr:7+0 records in
2026-03-08T23:27:44.508 INFO:tasks.workunit.client.0.vm03.stderr:7+0 records out
2026-03-08T23:27:44.508 INFO:tasks.workunit.client.0.vm03.stderr:1792 bytes (1.8 kB, 1.8 KiB) copied, 0.000153827 s, 11.6 MB/s
2026-03-08T23:27:44.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:142: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj5","key":"","snapid":7,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670
2026-03-08T23:27:45.667 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:144: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj6
2026-03-08T23:27:45.973 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:27:46.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:144: create_scenario: JSON='["1.0",{"oid":"obj6","key":"","snapid":-2,"hash":2202164420,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:46.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:145: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj6","key":"","snapid":-2,"hash":2202164420,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset
2026-03-08T23:27:47.671 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:146: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj7
2026-03-08T23:27:47.979 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:27:48.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:146: create_scenario: JSON='["1.0",{"oid":"obj7","key":"","snapid":-2,"hash":1552453721,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:48.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:147: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj7","key":"","snapid":-2,"hash":1552453721,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset corrupt
2026-03-08T23:27:49.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:148: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj8
2026-03-08T23:27:49.986 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:27:50.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:148: create_scenario: JSON='["1.0",{"oid":"obj8","key":"","snapid":-2,"hash":2381834917,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:50.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:149: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj8","key":"","snapid":-2,"hash":2381834917,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset seq
2026-03-08T23:27:51.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:150: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj9
2026-03-08T23:27:51.995 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:27:52.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:150: create_scenario: JSON='["1.0",{"oid":"obj9","key":"","snapid":-2,"hash":3833113727,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:52.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:151: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj9","key":"","snapid":-2,"hash":3833113727,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset clone_size
2026-03-08T23:27:53.695 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:152: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj10
2026-03-08T23:27:54.004 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:27:54.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:152: create_scenario: JSON='["1.0",{"oid":"obj10","key":"","snapid":-2,"hash":718195851,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:54.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:153: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj10","key":"","snapid":-2,"hash":718195851,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset clone_overlap
2026-03-08T23:27:55.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:154: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj11
2026-03-08T23:27:55.999 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:27:56.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:154: create_scenario: JSON='["1.0",{"oid":"obj11","key":"","snapid":-2,"hash":693400951,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:56.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:155: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj11","key":"","snapid":-2,"hash":693400951,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset clones
2026-03-08T23:27:57.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:156: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj12
2026-03-08T23:27:58.015 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:27:58.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:156: create_scenario: JSON='["1.0",{"oid":"obj12","key":"","snapid":-2,"hash":3551132405,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:27:58.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:157: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj12","key":"","snapid":-2,"hash":3551132405,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset head
2026-03-08T23:27:59.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:158: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj13
2026-03-08T23:28:00.037 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:28:00.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:158: create_scenario: JSON='["1.0",{"oid":"obj13","key":"","snapid":-2,"hash":2087409765,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:28:00.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:159: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj13","key":"","snapid":-2,"hash":2087409765,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset snaps
2026-03-08T23:28:01.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:160: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj14
2026-03-08T23:28:02.049 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:28:02.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:160: create_scenario: JSON='["1.0",{"oid":"obj14","key":"","snapid":-2,"hash":2484217095,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:28:02.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:161: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj14","key":"","snapid":-2,"hash":2484217095,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset size
2026-03-08T23:28:03.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:163: create_scenario: echo garbage
2026-03-08T23:28:03.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:164: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj15
2026-03-08T23:28:04.053 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:28:04.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:164: create_scenario: JSON='["1.0",{"oid":"obj15","key":"","snapid":-2,"hash":612772309,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:28:04.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:165: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj15","key":"","snapid":-2,"hash":612772309,"max":0,"pool":1,"namespace":"","max":0}]' set-attr snapset td/osd-scrub-snaps/bad
2026-03-08T23:28:05.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:166: create_scenario: rm -f td/osd-scrub-snaps/bad
2026-03-08T23:28:05.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:167: create_scenario: return 0
2026-03-08T23:28:05.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:204: TEST_scrub_snaps: rm -f testdata.420670
2026-03-08T23:28:05.748 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:206: TEST_scrub_snaps: expr 1 - 1
2026-03-08T23:28:05.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:206: TEST_scrub_snaps: seq 0 0
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:206: TEST_scrub_snaps: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:208: TEST_scrub_snaps: activate_osd td/osd-scrub-snaps 0
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-snaps
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-snaps/0
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 '
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/0'
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/0/journal'
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps'
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:05.750 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:28:05.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-snaps/0
2026-03-08T23:28:05.752 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:28:05.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:28:05.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:28:05.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-snaps/0/whoami
2026-03-08T23:28:05.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:28:05.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:28:05.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:28:05.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:28:05.766 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:05.768+0000 7ff15266d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:28:05.776 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:05.780+0000 7ff15266d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:28:05.777 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:05.780+0000 7ff15266d8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:28:05.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:28:06.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:28:06.737 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:06.740+0000 7ff15266d8c0 -1 Falling back to public interface
2026-03-08T23:28:07.100 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:28:07.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:28:07.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:28:07.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:28:07.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:28:07.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:28:07.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:28:07.712 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:07.716+0000 7ff15266d8c0 -1 osd.0 18 log_to_monitors true
2026-03-08T23:28:08.278 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:28:08.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:28:08.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:28:08.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:28:08.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:28:08.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:28:08.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:28:09.462 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:28:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:28:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:28:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:28:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:28:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:28:09.629 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 21 up_thru 21 down_at 19 last_clean_interval [5,18) [v2:127.0.0.1:6802/2425897067,v1:127.0.0.1:6803/2425897067] [v2:127.0.0.1:6804/2425897067,v1:127.0.0.1:6805/2425897067] exists,up 4b01d465-0e34-439f-9731-c93aae66efd4
2026-03-08T23:28:09.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:28:09.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:28:09.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:28:09.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:210: TEST_scrub_snaps: ceph tell 'osd.*' config set osd_shallow_scrub_chunk_max 25
2026-03-08T23:28:09.699 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:28:09.699 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_shallow_scrub_chunk_max = '' (not observed, change may require restart) osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' "
2026-03-08T23:28:09.699 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:28:09.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:211: TEST_scrub_snaps: ceph tell 'osd.*' config set osd_shallow_scrub_chunk_min 5
2026-03-08T23:28:09.783 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:28:09.783 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_shallow_scrub_chunk_min = '' (not observed, change may require restart) "
2026-03-08T23:28:09.783 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:28:09.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:212: TEST_scrub_snaps: ceph tell 'osd.*' config set osd_pg_stat_report_interval_max_seconds 1
2026-03-08T23:28:09.859 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:28:09.859 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_seconds = '' (not observed, change may require restart) "
2026-03-08T23:28:09.859 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:28:09.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:213: TEST_scrub_snaps: ceph tell 'osd.*' config set osd_pg_stat_report_interval_max_epochs 1 2026-03-08T23:28:09.935 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:28:09.935 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_epochs = '' (not observed, change may require restart) " 2026-03-08T23:28:09.935 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:28:09.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:216: TEST_scrub_snaps: wait_for_clean 2026-03-08T23:28:09.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:28:09.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:28:09.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:28:09.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:28:09.945 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:28:09.945 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:28:09.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:28:09.946 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:28:09.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:28:10.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:28:10.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:28:10.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:28:10.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:28:10.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:28:10.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:28:10.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:28:10.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:28:10.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:28:10.168 
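The `get_timeout_delays 90 .1` call traced above expands into the delay list `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: delays double from the base until they hit a cap, the cap repeats, and a final delay tops the sum up to exactly the timeout. A minimal sketch of that backoff scheme (the `make_delays` name and the cap value of 15 are inferred from the logged output, not copied from ceph-helpers.sh):

```shell
# Hypothetical re-creation of the backoff visible in the trace:
# doubling delays capped at `cap`, summing to exactly `timeout`.
make_delays() {
    timeout=$1 base=$2 cap=${3:-15}
    awk -v t="$timeout" -v d="$base" -v cap="$cap" 'BEGIN {
        total = 0; sep = ""
        while (1) {
            if (d > cap) d = cap                 # stop doubling at the cap
            if (total + d >= t) {                # last delay tops up to t
                printf "%s%g\n", sep, t - total
                break
            }
            printf "%s%g", sep, d; sep = " "
            total += d
            if (d < cap) d *= 2
        }
    }'
}

make_delays 90 .1
```

With the arguments from the trace this reproduces the logged delay list, totalling the 90-second timeout.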
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:28:10.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=90194313218 2026-03-08T23:28:10.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 90194313218 2026-03-08T23:28:10.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-90194313218' 2026-03-08T23:28:10.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:28:10.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-90194313218 2026-03-08T23:28:10.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:28:10.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:28:10.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-90194313218 2026-03-08T23:28:10.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:28:10.249 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 90194313218 2026-03-08T23:28:10.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=90194313218 2026-03-08T23:28:10.250 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 90194313218' 2026-03-08T23:28:10.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:28:10.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 90194313218 2026-03-08T23:28:10.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:28:11.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:28:11.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:28:11.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 90194313218 2026-03-08T23:28:11.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:28:12.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:28:12.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:28:12.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 90194313218 -lt 90194313218 2026-03-08T23:28:12.752 
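The `flush_pg_stats` trace above follows one pattern per OSD: `ceph tell osd.N flush_pg_stats` returns a sequence number, then the helper polls `ceph osd last-stat-seq N` once a second until the monitor has seen that sequence. A condensed sketch of that polling loop, with the seq-query command injected via `$get_seq_cmd` so the loop itself can be exercised without a cluster (this indirection is mine, not ceph-helpers.sh's):

```shell
# Poll until the injected seq-query command reports a sequence number
# at least as large as the one the flush returned.
wait_for_stat_seq() {
    osd=$1 target=$2 timeout=${3:-300}
    while [ "$($get_seq_cmd "$osd")" -lt "$target" ]; do
        [ "$timeout" -eq 0 ] && return 1
        timeout=$((timeout - 1))
        sleep 1
    done
    return 0
}

# Against a real cluster this would be wired up roughly as:
#   get_seq_cmd='ceph osd last-stat-seq'
#   seq=$(ceph tell osd.0 flush_pg_stats)
#   wait_for_stat_seq 0 "$seq"
```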
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:28:12.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:28:12.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:28:12.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:28:12.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:28:12.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:28:12.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:28:12.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:28:12.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:28:12.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:28:12.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:28:13.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:28:13.139 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:28:13.139 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:28:13.139 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:28:13.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:28:13.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:28:13.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:28:13.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:218: TEST_scrub_snaps: ceph tell 'osd.*' config get osd_shallow_scrub_chunk_max 2026-03-08T23:28:13.421 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:28:13.421 INFO:tasks.workunit.client.0.vm03.stdout: "osd_shallow_scrub_chunk_max": "25" 2026-03-08T23:28:13.421 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:28:13.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:219: TEST_scrub_snaps: ceph tell 'osd.*' config get osd_shallow_scrub_chunk_min 2026-03-08T23:28:13.501 
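The jq filter traced in `get_num_active_clean` counts PG states that contain both "active" and "clean" but not "stale", and `wait_for_clean` breaks once that count equals `get_num_pgs`. The same selection can be sketched with python3 (which this workunit already uses for JSON handling) for environments without jq; the `count_active_clean` helper name is mine:

```shell
# Count PGs whose state is active+clean and not stale, reading the
# same `pg dump pgs` JSON shape the trace shows.
count_active_clean() {
    python3 -c '
import json, sys
stats = json.load(sys.stdin)["pg_stats"]
ok = [p for p in stats
      if "active" in p["state"] and "clean" in p["state"]
      and "stale" not in p["state"]]
print(len(ok))
'
}

# Canned input standing in for `ceph --format json pg dump pgs`:
echo '{"pg_stats":[{"pgid":"1.0","state":"active+clean"},{"pgid":"1.1","state":"active+clean+stale"},{"pgid":"1.2","state":"peering"}]}' | count_active_clean
```

Only `1.0` passes the filter here; the stale PG and the peering PG are both excluded, matching the trace's `cur_active_clean=1`.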
INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:28:13.501 INFO:tasks.workunit.client.0.vm03.stdout: "osd_shallow_scrub_chunk_min": "5" 2026-03-08T23:28:13.501 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:28:13.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:220: TEST_scrub_snaps: ceph tell 'osd.*' config get osd_pg_stat_report_interval_max_seconds 2026-03-08T23:28:13.579 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:28:13.579 INFO:tasks.workunit.client.0.vm03.stdout: "osd_pg_stat_report_interval_max_seconds": "1" 2026-03-08T23:28:13.579 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:28:13.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:221: TEST_scrub_snaps: ceph tell 'osd.*' config get osd_pg_stat_report_interval_max_epochs 2026-03-08T23:28:13.661 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:28:13.661 INFO:tasks.workunit.client.0.vm03.stdout: "osd_pg_stat_report_interval_max_epochs": "1" 2026-03-08T23:28:13.661 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:28:13.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:222: TEST_scrub_snaps: ceph tell 'osd.*' config get osd_scrub_chunk_max 2026-03-08T23:28:13.741 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:28:13.741 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_chunk_max": "15" 2026-03-08T23:28:13.741 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:28:13.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:223: TEST_scrub_snaps: ceph tell 'osd.*' config get osd_scrub_chunk_min 2026-03-08T23:28:13.820 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:28:13.820 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_chunk_min": "5" 
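Each `config get` reply above is a one-key JSON object (e.g. `{ "osd_shallow_scrub_chunk_max": "25" }`), which is how the test confirms that the earlier `config set` calls, despite their "(not observed, change may require restart)" warnings, did take effect at runtime. A small helper to pull the value out of such a reply (hypothetical `config_value` name, python3-based like the workunit's own JSON handling):

```shell
# Extract the value for a given key from a one-key `config get` JSON reply.
config_value() {
    key=$1
    python3 -c 'import json, sys; print(json.load(sys.stdin)[sys.argv[1]])' "$key"
}

echo '{ "osd_shallow_scrub_chunk_max": "25" }' | config_value osd_shallow_scrub_chunk_max
```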
2026-03-08T23:28:13.820 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:28:13.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:225: TEST_scrub_snaps: local pgid=1.0 2026-03-08T23:28:13.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:226: TEST_scrub_snaps: pg_scrub 1.0 2026-03-08T23:28:13.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=1.0 2026-03-08T23:28:13.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 1.0 2026-03-08T23:28:13.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=1.0 2026-03-08T23:28:13.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:28:13.831 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:28:13.831 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:28:13.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:28:13.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:28:13.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 
2026-03-08T23:28:13.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:28:13.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:28:13.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:28:13.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:28:13.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:28:13.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:28:14.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids=0 2026-03-08T23:28:14.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:28:14.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:28:14.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:28:14.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=90194313220 
2026-03-08T23:28:14.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 90194313220 2026-03-08T23:28:14.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-90194313220' 2026-03-08T23:28:14.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:28:14.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-90194313220 2026-03-08T23:28:14.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:28:14.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:28:14.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-90194313220 2026-03-08T23:28:14.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:28:14.206 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 90194313220 2026-03-08T23:28:14.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=90194313220 2026-03-08T23:28:14.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 90194313220' 2026-03-08T23:28:14.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 
0 2026-03-08T23:28:14.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 90194313219 -lt 90194313220 2026-03-08T23:28:14.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:28:15.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:28:15.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:28:15.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 90194313219 -lt 90194313220 2026-03-08T23:28:15.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:28:16.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:28:16.541 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 1.0 loop 0 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 90194313220 -lt 90194313220 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: 
echo '#---------- 1.0 loop 0' 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 1.0 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=1.0 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 1.0 query 2026-03-08T23:28:16.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:28:16.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:28:16.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:28:16.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:28:16.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:28:16.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 1.0 2026-03-08T23:28:16.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:28:16.798 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:28:16.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:28:16.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:28:16.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local last_scrub=2026-03-08T23:27:19.157852+0000 2026-03-08T23:28:16.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 1.0 2026-03-08T23:28:17.121 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.0 to scrub 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 1.0 2026-03-08T23:27:19.157852+0000 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:27:19.157852+0000 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:28:17.135 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:28:17.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:28:17.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:27:19.157852+0000 '>' 2026-03-08T23:27:19.157852+0000 2026-03-08T23:28:17.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:28:18.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:28:18.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:28:18.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:28:18.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:28:18.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:28:18.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:28:18.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:28:18.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:27:19.157852+0000 '>' 2026-03-08T23:27:19.157852+0000 2026-03-08T23:28:18.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:28:19.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:28:19.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:28:19.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:28:19.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:28:19.486 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:28:19.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:28:19.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:28:19.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:27:19.157852+0000 '>' 2026-03-08T23:27:19.157852+0000 2026-03-08T23:28:19.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:28:20.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:28:20.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:28:20.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:28:20.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:28:20.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:28:20.650 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:28:20.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:28:20.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:28:17.762461+0000 '>' 2026-03-08T23:27:19.157852+0000 2026-03-08T23:28:20.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:28:20.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:230: TEST_scrub_snaps: grep '_scan_snaps start' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:20.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:230: TEST_scrub_snaps: wc -l 2026-03-08T23:28:20.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:230: TEST_scrub_snaps: test 2 = 2 2026-03-08T23:28:20.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:232: TEST_scrub_snaps: rados list-inconsistent-pg test 2026-03-08T23:28:20.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:234: TEST_scrub_snaps: jq '. 
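The `wait_for_scrub` loop above compares scrub stamps with a plain string test (`test "$stamp" '>' "$last_scrub"`), which is safe because these ISO-8601 timestamps have a fixed-width year-to-microsecond layout, so lexicographic order matches chronological order. The comparison reduces to:

```shell
# Succeeds once the current stamp ($2) is strictly newer than the
# stamp recorded before the scrub was requested ($1). Relies on
# fixed-width ISO-8601 stamps sorting lexicographically.
stamp_advanced() {
    test "$2" '>' "$1"
}
```

In the trace, the loop repeats while the stamp is unchanged and returns as soon as `2026-03-08T23:28:17.762461+0000` compares greater than the pre-scrub `2026-03-08T23:27:19.157852+0000`.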
| length' td/osd-scrub-snaps/json 2026-03-08T23:28:20.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:234: TEST_scrub_snaps: test 1 = 1 2026-03-08T23:28:20.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:236: TEST_scrub_snaps: jq -r '.[0]' td/osd-scrub-snaps/json 2026-03-08T23:28:20.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:236: TEST_scrub_snaps: test 1.0 = 1.0 2026-03-08T23:28:20.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:238: TEST_scrub_snaps: rados list-inconsistent-obj 1.0 2026-03-08T23:28:20.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:243: TEST_scrub_snaps: jq .inconsistents 2026-03-08T23:28:20.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:243: TEST_scrub_snaps: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print ( json.dumps(ud, sort_keys=True, indent=2) )' 2026-03-08T23:28:20.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:250: TEST_scrub_snaps: jq .inconsistents td/osd-scrub-snaps/json 2026-03-08T23:28:20.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:250: TEST_scrub_snaps: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print ( json.dumps(ud, sort_keys=True, indent=2) )' 2026-03-08T23:28:20.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:251: TEST_scrub_snaps: multidiff td/osd-scrub-snaps/checkcsjson 
td/osd-scrub-snaps/csjson 2026-03-08T23:28:20.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-snaps/checkcsjson td/osd-scrub-snaps/csjson 2026-03-08T23:28:20.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:253: TEST_scrub_snaps: rados list-inconsistent-snapset 1.0 2026-03-08T23:28:20.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:255: TEST_scrub_snaps: jq .inconsistents 2026-03-08T23:28:20.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:255: TEST_scrub_snaps: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print ( json.dumps(ud, sort_keys=True, indent=2) )' 2026-03-08T23:28:20.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:634: TEST_scrub_snaps: jq .inconsistents td/osd-scrub-snaps/json 2026-03-08T23:28:20.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:634: TEST_scrub_snaps: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print ( json.dumps(ud, sort_keys=True, indent=2) )' 2026-03-08T23:28:20.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:635: TEST_scrub_snaps: multidiff td/osd-scrub-snaps/checkcsjson td/osd-scrub-snaps/csjson 2026-03-08T23:28:20.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-snaps/checkcsjson td/osd-scrub-snaps/csjson 2026-03-08T23:28:20.953 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:636: TEST_scrub_snaps: test no = yes 2026-03-08T23:28:20.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:641: TEST_scrub_snaps: test '' = yes 2026-03-08T23:28:20.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:646: TEST_scrub_snaps: find td/osd-scrub-snaps 2026-03-08T23:28:20.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:646: TEST_scrub_snaps: grep 'osd[^/]*\.pid' 2026-03-08T23:28:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:646: TEST_scrub_snaps: pidfiles=td/osd-scrub-snaps/osd.0.pid 2026-03-08T23:28:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:647: TEST_scrub_snaps: pids= 2026-03-08T23:28:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:648: TEST_scrub_snaps: for pidfile in ${pidfiles} 2026-03-08T23:28:20.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:650: TEST_scrub_snaps: cat td/osd-scrub-snaps/osd.0.pid 2026-03-08T23:28:20.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:650: TEST_scrub_snaps: pids+='435976 ' 2026-03-08T23:28:20.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:653: TEST_scrub_snaps: ERRORS=0 2026-03-08T23:28:20.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:655: TEST_scrub_snaps: seq 1 7 
2026-03-08T23:28:20.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:655: TEST_scrub_snaps: for i in `seq 1 7` 2026-03-08T23:28:20.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:657: TEST_scrub_snaps: rados -p test rmsnap snap1 2026-03-08T23:28:21.026 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap1 2026-03-08T23:28:21.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:655: TEST_scrub_snaps: for i in `seq 1 7` 2026-03-08T23:28:21.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:657: TEST_scrub_snaps: rados -p test rmsnap snap2 2026-03-08T23:28:21.128 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap2 2026-03-08T23:28:21.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:655: TEST_scrub_snaps: for i in `seq 1 7` 2026-03-08T23:28:21.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:657: TEST_scrub_snaps: rados -p test rmsnap snap3 2026-03-08T23:28:21.232 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap3 2026-03-08T23:28:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:655: TEST_scrub_snaps: for i in `seq 1 7` 2026-03-08T23:28:21.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:657: TEST_scrub_snaps: rados -p test rmsnap snap4 2026-03-08T23:28:21.337 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap4 2026-03-08T23:28:21.341 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:655: TEST_scrub_snaps: for i in `seq 1 7` 2026-03-08T23:28:21.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:657: TEST_scrub_snaps: rados -p test rmsnap snap5 2026-03-08T23:28:21.441 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap5 2026-03-08T23:28:21.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:655: TEST_scrub_snaps: for i in `seq 1 7` 2026-03-08T23:28:21.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:657: TEST_scrub_snaps: rados -p test rmsnap snap6 2026-03-08T23:28:21.545 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap6 2026-03-08T23:28:21.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:655: TEST_scrub_snaps: for i in `seq 1 7` 2026-03-08T23:28:21.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:657: TEST_scrub_snaps: rados -p test rmsnap snap7 2026-03-08T23:28:21.649 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap7 2026-03-08T23:28:21.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:659: TEST_scrub_snaps: sleep 5 2026-03-08T23:28:26.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:660: TEST_scrub_snaps: local -i loop=0 2026-03-08T23:28:26.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:661: TEST_scrub_snaps: ceph pg dump pgs 2026-03-08T23:28:26.653 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:661: TEST_scrub_snaps: grep -q snaptrim 2026-03-08T23:28:26.806 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:28:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:663: TEST_scrub_snaps: ceph pg dump pgs 2026-03-08T23:28:26.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:663: TEST_scrub_snaps: grep -q snaptrim_error 2026-03-08T23:28:26.963 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:28:26.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:665: TEST_scrub_snaps: break 2026-03-08T23:28:26.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:674: TEST_scrub_snaps: ceph pg dump pgs 2026-03-08T23:28:27.121 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T23:28:27.121 INFO:tasks.workunit.client.0.vm03.stdout:1.0 35 0 0 0 0 23416 0 0 60 0 60 active+clean+inconsistent+snaptrim_error 2026-03-08T23:28:21.657534+0000 23'60 29:138 [0] 0 [0] 0 18'56 2026-03-08T23:28:17.762461+0000 0'0 2026-03-08T23:27:19.157852+0000 7 1 periodic scrub scheduled @ 2026-03-10T07:15:55.673393+0000 33 0 2026-03-08T23:28:27.121 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T23:28:27.121 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on 
utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 2026-03-08T23:28:27.121 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:28:27.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:676: TEST_scrub_snaps: for pid in $pids 2026-03-08T23:28:27.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:678: TEST_scrub_snaps: kill -0 435976 2026-03-08T23:28:27.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:685: TEST_scrub_snaps: kill_daemons td/osd-scrub-snaps 2026-03-08T23:28:27.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:28:27.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:28:27.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:28:27.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:28:27.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:28:32.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:28:32.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:687: TEST_scrub_snaps: declare -a err_strings 2026-03-08T23:28:32.455 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:688: TEST_scrub_snaps: err_strings[0]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*::obj10:.* : is missing in clone_overlap' 2026-03-08T23:28:32.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:689: TEST_scrub_snaps: err_strings[1]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*::obj5:7 : no '\''_'\'' attr' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:690: TEST_scrub_snaps: err_strings[2]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*::obj5:7 : is an unexpected clone' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:691: TEST_scrub_snaps: err_strings[3]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*::obj5:4 : on disk size [(]4608[)] does not match object info size [(]512[)] adjusted for ondisk to [(]512[)]' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:692: TEST_scrub_snaps: err_strings[4]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj5:head : expected clone .*:::obj5:2' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:693: TEST_scrub_snaps: err_strings[5]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj5:head : expected clone .*:::obj5:1' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:694: TEST_scrub_snaps: err_strings[6]='log_channel[(]cluster[)] log [[]INF[]] : scrub [0-9]*[.]0 .*:::obj5:head : 2 missing clone[(]s[)]' 
2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:695: TEST_scrub_snaps: err_strings[7]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj8:head : snaps.seq not set' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:696: TEST_scrub_snaps: err_strings[8]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj7:1 : is an unexpected clone' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:697: TEST_scrub_snaps: err_strings[9]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj3:head : on disk size [(]3840[)] does not match object info size [(]768[)] adjusted for ondisk to [(]768[)]' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:698: TEST_scrub_snaps: err_strings[10]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj6:1 : is an unexpected clone' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:699: TEST_scrub_snaps: err_strings[11]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj2:head : no '\''snapset'\'' attr' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:700: TEST_scrub_snaps: err_strings[12]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj2:7 : clone ignored due to missing snapset' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:701: TEST_scrub_snaps: err_strings[13]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 
.*:::obj2:4 : clone ignored due to missing snapset' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:702: TEST_scrub_snaps: err_strings[14]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj4:head : expected clone .*:::obj4:7' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:703: TEST_scrub_snaps: err_strings[15]='log_channel[(]cluster[)] log [[]INF[]] : scrub [0-9]*[.]0 .*:::obj4:head : 1 missing clone[(]s[)]' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:704: TEST_scrub_snaps: err_strings[16]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj1:1 : is an unexpected clone' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:705: TEST_scrub_snaps: err_strings[17]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj9:1 : is missing in clone_size' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:706: TEST_scrub_snaps: err_strings[18]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj11:1 : is an unexpected clone' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:707: TEST_scrub_snaps: err_strings[19]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj14:1 : size 1032 != clone_size 1033' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:708: TEST_scrub_snaps: err_strings[20]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 20 errors' 
2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:709: TEST_scrub_snaps: err_strings[21]='log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj15:head : can'\''t decode '\''snapset'\'' attr ' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:710: TEST_scrub_snaps: err_strings[22]='log_channel[(]cluster[)] log [[]ERR[]] : osd[.][0-9]* found snap mapper error on pg 1.0 oid 1:461f8b5e:::obj16:7 snaps missing in mapper, should be: {1, 2, 3, 4, 5, 6, 7} ...repaired' 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*::obj10:.* : is missing in clone_overlap' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*::obj5:7 : no '\''_'\'' attr' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.458 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*::obj5:7 : is an unexpected clone' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*::obj5:4 : on disk size [(]4608[)] does not match object info size [(]512[)] adjusted for ondisk to [(]512[)]' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj5:head : expected clone .*:::obj5:2' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj5:head : expected clone .*:::obj5:1' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: 
TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]INF[]] : scrub [0-9]*[.]0 .*:::obj5:head : 2 missing clone[(]s[)]' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj8:head : snaps.seq not set' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj7:1 : is an unexpected clone' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj3:head : on disk size [(]3840[)] does not match object info size [(]768[)] adjusted for ondisk to [(]768[)]' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.469 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj6:1 : is an unexpected clone' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj2:head : no '\''snapset'\'' attr' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj2:7 : clone ignored due to missing snapset' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj2:4 : clone ignored due to missing snapset' td/osd-scrub-snaps/osd.0.log 
2026-03-08T23:28:32.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj4:head : expected clone .*:::obj4:7' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]INF[]] : scrub [0-9]*[.]0 .*:::obj4:head : 1 missing clone[(]s[)]' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj1:1 : is an unexpected clone' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj9:1 : is missing in clone_size' td/osd-scrub-snaps/osd.0.log 
2026-03-08T23:28:32.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj11:1 : is an unexpected clone' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj14:1 : size 1032 != clone_size 1033' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 20 errors' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : scrub [0-9]*[.]0 .*:::obj15:head : can'\''t decode '\''snapset'\'' attr ' td/osd-scrub-snaps/osd.0.log 
2026-03-08T23:28:32.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:712: TEST_scrub_snaps: for err_string in "${err_strings[@]}" 2026-03-08T23:28:32.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:714: TEST_scrub_snaps: grep 'log_channel[(]cluster[)] log [[]ERR[]] : osd[.][0-9]* found snap mapper error on pg 1.0 oid 1:461f8b5e:::obj16:7 snaps missing in mapper, should be: {1, 2, 3, 4, 5, 6, 7} ...repaired' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:721: TEST_scrub_snaps: '[' 0 '!=' 0 ']' 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:727: TEST_scrub_snaps: echo 'TEST PASSED' 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stdout:TEST PASSED 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:728: TEST_scrub_snaps: return 0 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:40: run: teardown td/osd-scrub-snaps 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-snaps 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-snaps KILL 2026-03-08T23:28:32.486 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:28:32.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:28:32.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:28:32.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:28:32.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:28:32.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:28:32.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:28:32.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:28:32.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:28:32.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:28:32.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:28:32.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:28:32.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:28:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:28:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:28:32.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-snaps
2026-03-08T23:28:32.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:28:32.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:32.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:32.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.420670
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:37: run: for func in $funcs
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:38: run: setup td/osd-scrub-snaps
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-snaps
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-snaps
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-snaps
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-snaps KILL
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:28:32.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:28:32.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:28:32.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:28:32.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:28:32.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:28:32.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:28:32.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:28:32.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:28:32.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:28:32.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:28:32.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:28:32.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:28:32.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:28:32.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:28:32.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-snaps
2026-03-08T23:28:32.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:28:32.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:32.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:32.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.420670
2026-03-08T23:28:32.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:28:32.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:28:32.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-snaps
2026-03-08T23:28:32.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:28:32.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:32.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:32.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.420670
2026-03-08T23:28:32.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-snaps 1' TERM HUP INT
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:39: run: TEST_scrub_snaps_primary td/osd-scrub-snaps
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1177: TEST_scrub_snaps_primary: local dir=td/osd-scrub-snaps
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1178: TEST_scrub_snaps_primary: ORIG_ARGS='--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 '
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1179: TEST_scrub_snaps_primary: CEPH_ARGS+=' --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1'
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1180: TEST_scrub_snaps_primary: _scrub_snaps_multi td/osd-scrub-snaps primary
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:732: _scrub_snaps_multi: local dir=td/osd-scrub-snaps
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:733: _scrub_snaps_multi: local poolname=test
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:734: _scrub_snaps_multi: local OBJS=16
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:735: _scrub_snaps_multi: local OSDS=2
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:736: _scrub_snaps_multi: local which=primary
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:738: _scrub_snaps_multi: TESTDATA=testdata.420670
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:740: _scrub_snaps_multi: run_mon td/osd-scrub-snaps a --osd_pool_default_size=2
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-snaps
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-snaps/a
2026-03-08T23:28:32.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-snaps/a --run-dir=td/osd-scrub-snaps --osd_pool_default_size=2
2026-03-08T23:28:32.542 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:28:32.542 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:28:32.542 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:28:32.542 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:28:32.542 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:32.543 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:32.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:28:32.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-snaps/a '--log-file=td/osd-scrub-snaps/$name.log' '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-snaps/log --run-dir=td/osd-scrub-snaps '--pid-file=td/osd-scrub-snaps/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2
2026-03-08T23:28:32.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:28:32.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:28:32.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:28:32.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:28:32.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:28:32.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:28:32.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:28:32.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:28:32.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:28:32.580 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:28:32.580 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:32.580 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:32.581 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.420670/ceph-mon.a.asok
2026-03-08T23:28:32.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:28:32.581 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.420670/ceph-mon.a.asok config get fsid
2026-03-08T23:28:32.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:28:32.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:28:32.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:28:32.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:28:32.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:28:32.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:28:32.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:28:32.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:28:32.652 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:28:32.652 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:32.652 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:32.653 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.420670/ceph-mon.a.asok
2026-03-08T23:28:32.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:28:32.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.420670/ceph-mon.a.asok config get mon_host
2026-03-08T23:28:32.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:741: _scrub_snaps_multi: run_mgr td/osd-scrub-snaps x
2026-03-08T23:28:32.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-snaps
2026-03-08T23:28:32.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:28:32.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:28:32.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:28:32.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-snaps/x
2026-03-08T23:28:32.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:28:32.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:28:32.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:28:32.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:28:32.832 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:28:32.832 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:32.832 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:32.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:28:32.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:28:32.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-snaps/x '--log-file=td/osd-scrub-snaps/$name.log' '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --run-dir=td/osd-scrub-snaps '--pid-file=td/osd-scrub-snaps/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:28:32.854 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:742: _scrub_snaps_multi: expr 2 - 1
2026-03-08T23:28:32.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:742: _scrub_snaps_multi: seq 0 1
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:742: _scrub_snaps_multi: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:744: _scrub_snaps_multi: run_osd td/osd-scrub-snaps 0
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-snaps
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-snaps/0
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/0'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/0/journal'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:28:32.860 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:28:32.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-snaps/0
2026-03-08T23:28:32.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:28:32.862 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 d9f2ab9c-cf1e-4f64-9803-1a7e34ca33a9
2026-03-08T23:28:32.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=d9f2ab9c-cf1e-4f64-9803-1a7e34ca33a9
2026-03-08T23:28:32.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 d9f2ab9c-cf1e-4f64-9803-1a7e34ca33a9'
2026-03-08T23:28:32.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:28:32.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAgBq5pAb2+NBAAMGVqfoiG+bxUF3d2efAKzg==
2026-03-08T23:28:32.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAgBq5pAb2+NBAAMGVqfoiG+bxUF3d2efAKzg=="}'
2026-03-08T23:28:32.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new d9f2ab9c-cf1e-4f64-9803-1a7e34ca33a9 -i td/osd-scrub-snaps/0/new.json
2026-03-08T23:28:32.981 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:28:32.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-snaps/0/new.json
2026-03-08T23:28:32.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAgBq5pAb2+NBAAMGVqfoiG+bxUF3d2efAKzg== --osd-uuid d9f2ab9c-cf1e-4f64-9803-1a7e34ca33a9
2026-03-08T23:28:33.016 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:33.020+0000 7f5c7eefa8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:28:33.022 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:33.024+0000 7f5c7eefa8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:28:33.031 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:33.024+0000 7f5c7eefa8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:28:33.032 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:33.024+0000 7f5c7eefa8c0 -1 bdev(0x557d4761ec00 td/osd-scrub-snaps/0/block) open stat got: (1) Operation not permitted
2026-03-08T23:28:33.032 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:33.024+0000 7f5c7eefa8c0 -1 bluestore(td/osd-scrub-snaps/0) _read_fsid unparsable uuid
2026-03-08T23:28:35.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-snaps/0/keyring
2026-03-08T23:28:35.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:28:35.274 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T23:28:35.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T23:28:35.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-snaps/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:28:35.403 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:28:35.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T23:28:35.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100
--osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:28:35.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:28:35.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:28:35.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:28:35.424 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:35.428+0000 7f3230bbf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:28:35.426 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:35.428+0000 7f3230bbf8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:28:35.428 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:35.428+0000 7f3230bbf8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:28:35.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:28:35.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:28:35.877 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:35.880+0000 7f3230bbf8c0 -1 Falling back to public interface 2026-03-08T23:28:36.769 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:28:36.769 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:28:36.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:28:36.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:28:36.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:28:36.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:28:36.859 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:36.860+0000 7f3230bbf8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:28:36.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:28:37.805 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:37.812+0000 7f322c378640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:28:37.945 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:28:37.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:28:37.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:28:37.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:28:37.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:28:37.946 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:28:38.122 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/642384440,v1:127.0.0.1:6803/642384440] [v2:127.0.0.1:6804/642384440,v1:127.0.0.1:6805/642384440] exists,up d9f2ab9c-cf1e-4f64-9803-1a7e34ca33a9 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:742: _scrub_snaps_multi: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:744: _scrub_snaps_multi: run_osd td/osd-scrub-snaps 1 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-snaps 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:28:38.123 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-snaps/1 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/1' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/1/journal' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:28:38.123 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:28:38.123 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:28:38.124 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:28:38.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-snaps/1 2026-03-08T23:28:38.125 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: 
uuidgen 2026-03-08T23:28:38.125 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 f6820cf7-1215-46c2-a267-6952058a734a 2026-03-08T23:28:38.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=f6820cf7-1215-46c2-a267-6952058a734a 2026-03-08T23:28:38.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 f6820cf7-1215-46c2-a267-6952058a734a' 2026-03-08T23:28:38.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:28:38.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAmBq5p9JXLCBAAHr89YbuKHwFUJCV9nIVX9Q== 2026-03-08T23:28:38.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAmBq5p9JXLCBAAHr89YbuKHwFUJCV9nIVX9Q=="}' 2026-03-08T23:28:38.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new f6820cf7-1215-46c2-a267-6952058a734a -i td/osd-scrub-snaps/1/new.json 2026-03-08T23:28:38.300 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:28:38.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-snaps/1/new.json 2026-03-08T23:28:38.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 
--osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/1 --osd-journal=td/osd-scrub-snaps/1/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQAmBq5p9JXLCBAAHr89YbuKHwFUJCV9nIVX9Q== --osd-uuid f6820cf7-1215-46c2-a267-6952058a734a 2026-03-08T23:28:38.334 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:38.336+0000 7ff07ce398c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:28:38.336 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:38.340+0000 7ff07ce398c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:28:38.337 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:38.340+0000 7ff07ce398c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:28:38.337 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:38.340+0000 7ff07ce398c0 -1 bdev(0x55dff264bc00 td/osd-scrub-snaps/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:28:38.337 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:38.340+0000 7ff07ce398c0 -1 bluestore(td/osd-scrub-snaps/1) _read_fsid unparsable uuid 2026-03-08T23:28:40.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-snaps/1/keyring 2026-03-08T23:28:40.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:28:40.601 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:28:40.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:28:40.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-snaps/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:28:40.801 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:28:40.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:28:40.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/1 --osd-journal=td/osd-scrub-snaps/1/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:28:40.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:28:40.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:28:40.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:28:40.827 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:40.828+0000 7fea6dde08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:28:40.839 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:40.840+0000 7fea6dde08c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:28:40.847 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:40.844+0000 7fea6dde08c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:28:40.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:28:41.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:28:41.296 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:41.300+0000 7fea6dde08c0 -1 Falling back to public interface 2026-03-08T23:28:42.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:28:42.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:28:42.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:28:42.145 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:28:42.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:28:42.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:28:42.261 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:28:42.264+0000 7fea6dde08c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:28:42.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:28:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:28:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:28:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:28:43.320 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:28:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:28:43.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:28:43.495 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 9 
up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1237280738,v1:127.0.0.1:6811/1237280738] [v2:127.0.0.1:6812/1237280738,v1:127.0.0.1:6813/1237280738] exists,up f6820cf7-1215-46c2-a267-6952058a734a 2026-03-08T23:28:43.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:28:43.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:28:43.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:28:43.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:748: _scrub_snaps_multi: ceph osd set noscrub 2026-03-08T23:28:43.701 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:28:43.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:749: _scrub_snaps_multi: ceph osd set nodeep-scrub 2026-03-08T23:28:43.907 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:28:43.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:752: _scrub_snaps_multi: create_pool test 1 1 2026-03-08T23:28:43.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T23:28:44.116 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:28:44.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:28:45.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:753: 
_scrub_snaps_multi: wait_for_clean 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:28:45.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:28:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:28:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: 
local -a delays 2026-03-08T23:28:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:28:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:28:45.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:28:45.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:28:45.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:28:45.351 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:28:45.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:28:45.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:28:45.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:28:45.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483 2026-03-08T23:28:45.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483 2026-03-08T23:28:45.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483' 2026-03-08T23:28:45.437 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:28:45.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:28:45.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=38654705666 2026-03-08T23:28:45.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 38654705666 2026-03-08T23:28:45.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-38654705666' 2026-03-08T23:28:45.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:28:45.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483 2026-03-08T23:28:45.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:28:45.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:28:45.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483 2026-03-08T23:28:45.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:28:45.520 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483 2026-03-08T23:28:45.520 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483 2026-03-08T23:28:45.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483' 2026-03-08T23:28:45.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:28:45.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:28:45.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:28:46.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:28:46.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:28:46.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:28:46.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:28:47.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:28:47.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:28:48.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836483 2026-03-08T23:28:48.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:28:48.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-38654705666 2026-03-08T23:28:48.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:28:48.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:28:48.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-38654705666 2026-03-08T23:28:48.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:28:48.021 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 38654705666 2026-03-08T23:28:48.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=38654705666 2026-03-08T23:28:48.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 38654705666' 2026-03-08T23:28:48.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:28:48.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 38654705666 -lt 38654705666 2026-03-08T23:28:48.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:28:48.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:28:48.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:28:48.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:28:48.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:28:48.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:28:48.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:28:48.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:28:48.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:28:48.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:28:48.399 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:28:48.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:28:48.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:28:48.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:28:48.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:28:48.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:28:48.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:28:48.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:28:48.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:754: _scrub_snaps_multi: ceph osd dump 2026-03-08T23:28:48.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:754: _scrub_snaps_multi: awk '{ print $2 }' 2026-03-08T23:28:48.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:754: _scrub_snaps_multi: grep 
'^pool.*['\'']test['\'']' 2026-03-08T23:28:48.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:754: _scrub_snaps_multi: poolid=1 2026-03-08T23:28:48.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:756: _scrub_snaps_multi: dd if=/dev/urandom of=testdata.420670 bs=1032 count=1 2026-03-08T23:28:48.932 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:28:48.932 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:28:48.932 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 6.2487e-05 s, 16.5 MB/s 2026-03-08T23:28:48.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: seq 1 16 2026-03-08T23:28:48.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:48.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj1 testdata.420670 2026-03-08T23:28:48.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:48.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj2 testdata.420670 2026-03-08T23:28:48.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:48.981 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj3 testdata.420670 2026-03-08T23:28:49.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj4 testdata.420670 2026-03-08T23:28:49.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj5 testdata.420670 2026-03-08T23:28:49.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj6 testdata.420670 2026-03-08T23:28:49.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj7 testdata.420670 2026-03-08T23:28:49.099 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.099 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj8 testdata.420670 2026-03-08T23:28:49.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj9 testdata.420670 2026-03-08T23:28:49.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj10 testdata.420670 2026-03-08T23:28:49.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj11 testdata.420670 2026-03-08T23:28:49.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj12 testdata.420670 2026-03-08T23:28:49.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.218 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj13 testdata.420670 2026-03-08T23:28:49.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj14 testdata.420670 2026-03-08T23:28:49.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj15 testdata.420670 2026-03-08T23:28:49.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:28:49.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj16 testdata.420670 2026-03-08T23:28:49.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:762: _scrub_snaps_multi: get_primary test obj1 2026-03-08T23:28:49.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:28:49.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:28:49.316 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:28:49.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:28:49.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:762: _scrub_snaps_multi: local primary=1 2026-03-08T23:28:49.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:763: _scrub_snaps_multi: get_not_primary test obj1 2026-03-08T23:28:49.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=test 2026-03-08T23:28:49.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=obj1 2026-03-08T23:28:49.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary test obj1 2026-03-08T23:28:49.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:28:49.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:28:49.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:28:49.490 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:28:49.657 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T23:28:49.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map test obj1 2026-03-08T23:28:49.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T23:28:49.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:763: _scrub_snaps_multi: local replica=0 2026-03-08T23:28:49.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:765: _scrub_snaps_multi: eval create_scenario td/osd-scrub-snaps test testdata.420670 '$primary' 2026-03-08T23:28:49.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:765: _scrub_snaps_multi: create_scenario td/osd-scrub-snaps test testdata.420670 1 2026-03-08T23:28:49.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:45: create_scenario: local dir=td/osd-scrub-snaps 2026-03-08T23:28:49.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:46: create_scenario: local poolname=test 2026-03-08T23:28:49.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:47: create_scenario: local TESTDATA=testdata.420670 2026-03-08T23:28:49.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:48: create_scenario: local osd=1 2026-03-08T23:28:49.828 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:50: create_scenario: SNAP=1 2026-03-08T23:28:49.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:51: create_scenario: rados -p test mksnap snap1 2026-03-08T23:28:49.901 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap1 2026-03-08T23:28:49.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:52: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=1 2026-03-08T23:28:49.904 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:28:49.904 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:28:49.904 INFO:tasks.workunit.client.0.vm03.stderr:256 bytes copied, 4.9463e-05 s, 5.2 MB/s 2026-03-08T23:28:49.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:53: create_scenario: rados -p test put obj1 testdata.420670 2026-03-08T23:28:49.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:54: create_scenario: rados -p test put obj5 testdata.420670 2026-03-08T23:28:49.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:55: create_scenario: rados -p test put obj3 testdata.420670 2026-03-08T23:28:49.974 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: seq 6 14 2026-03-08T23:28:49.975 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:49.975 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj6 testdata.420670 2026-03-08T23:28:50.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:50.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj7 testdata.420670 2026-03-08T23:28:50.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:50.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj8 testdata.420670 2026-03-08T23:28:50.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:50.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj9 testdata.420670 2026-03-08T23:28:50.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:50.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj10 testdata.420670 2026-03-08T23:28:50.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:50.100 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj11 testdata.420670 2026-03-08T23:28:50.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:50.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj12 testdata.420670 2026-03-08T23:28:50.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:50.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj13 testdata.420670 2026-03-08T23:28:50.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:28:50.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj14 testdata.420670 2026-03-08T23:28:50.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:60: create_scenario: SNAP=2 2026-03-08T23:28:50.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:61: create_scenario: rados -p test mksnap snap2 2026-03-08T23:28:50.269 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap2 2026-03-08T23:28:50.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:62: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 
count=2 2026-03-08T23:28:50.272 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records in 2026-03-08T23:28:50.272 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records out 2026-03-08T23:28:50.272 INFO:tasks.workunit.client.0.vm03.stderr:512 bytes copied, 7.1885e-05 s, 7.1 MB/s 2026-03-08T23:28:50.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:63: create_scenario: rados -p test put obj5 testdata.420670 2026-03-08T23:28:50.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:65: create_scenario: SNAP=3 2026-03-08T23:28:50.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:66: create_scenario: rados -p test mksnap snap3 2026-03-08T23:28:50.370 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap3 2026-03-08T23:28:50.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:67: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=3 2026-03-08T23:28:50.375 INFO:tasks.workunit.client.0.vm03.stderr:3+0 records in 2026-03-08T23:28:50.375 INFO:tasks.workunit.client.0.vm03.stderr:3+0 records out 2026-03-08T23:28:50.375 INFO:tasks.workunit.client.0.vm03.stderr:768 bytes copied, 7.476e-05 s, 10.3 MB/s 2026-03-08T23:28:50.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:68: create_scenario: rados -p test put obj3 testdata.420670 2026-03-08T23:28:50.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:70: create_scenario: SNAP=4 2026-03-08T23:28:50.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:71: create_scenario: rados -p test mksnap snap4 2026-03-08T23:28:50.476 
INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap4
2026-03-08T23:28:50.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:72: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=4
2026-03-08T23:28:50.479 INFO:tasks.workunit.client.0.vm03.stderr:4+0 records in
2026-03-08T23:28:50.479 INFO:tasks.workunit.client.0.vm03.stderr:4+0 records out
2026-03-08T23:28:50.479 INFO:tasks.workunit.client.0.vm03.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 0.000114615 s, 8.9 MB/s
2026-03-08T23:28:50.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:73: create_scenario: rados -p test put obj5 testdata.420670
2026-03-08T23:28:50.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:74: create_scenario: rados -p test put obj2 testdata.420670
2026-03-08T23:28:50.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:76: create_scenario: SNAP=5
2026-03-08T23:28:50.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:77: create_scenario: rados -p test mksnap snap5
2026-03-08T23:28:50.580 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap5
2026-03-08T23:28:50.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:78: create_scenario: SNAP=6
2026-03-08T23:28:50.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:79: create_scenario: rados -p test mksnap snap6
2026-03-08T23:28:50.685 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap6
2026-03-08T23:28:50.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:80: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=6
2026-03-08T23:28:50.689 INFO:tasks.workunit.client.0.vm03.stderr:6+0 records in
2026-03-08T23:28:50.689 INFO:tasks.workunit.client.0.vm03.stderr:6+0 records out
2026-03-08T23:28:50.689 INFO:tasks.workunit.client.0.vm03.stderr:1536 bytes (1.5 kB, 1.5 KiB) copied, 7.9309e-05 s, 19.4 MB/s
2026-03-08T23:28:50.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:81: create_scenario: rados -p test put obj5 testdata.420670
2026-03-08T23:28:50.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:83: create_scenario: SNAP=7
2026-03-08T23:28:50.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:84: create_scenario: rados -p test mksnap snap7
2026-03-08T23:28:50.789 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap7
2026-03-08T23:28:50.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:86: create_scenario: rados -p test rm obj4
2026-03-08T23:28:50.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:87: create_scenario: rados -p test rm obj16
2026-03-08T23:28:50.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:88: create_scenario: rados -p test rm obj2
2026-03-08T23:28:50.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:90: create_scenario: kill_daemons td/osd-scrub-snaps TERM osd
2026-03-08T23:28:50.863 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:28:50.863 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:28:50.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:28:50.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:28:50.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:28:50.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:28:50.969 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:94: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj1
2026-03-08T23:28:51.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:94: create_scenario: JSON='["1.0",{"oid":"obj1","key":"","snapid":-2,"hash":1828249343,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:28:51.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:95: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj1","key":"","snapid":-2,"hash":1828249343,"max":0,"pool":1,"namespace":"","max":0}]' --force remove
2026-03-08T23:28:52.486 INFO:tasks.workunit.client.0.vm03.stdout:WARNING: only removing head with clones present
2026-03-08T23:28:52.486 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:ff7b1f36:::obj1:head#
2026-03-08T23:28:53.019 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --op list obj5
2026-03-08T23:28:53.020 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: grep '"snapid":2'
2026-03-08T23:28:53.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":2,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:28:53.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:98: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj5","key":"","snapid":2,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' remove
2026-03-08T23:28:54.498 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:c52c9666:::obj5:2#
2026-03-08T23:28:55.031 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --op list obj5
2026-03-08T23:28:55.031 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: grep '"snapid":1'
2026-03-08T23:28:55.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:28:55.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:101: create_scenario: OBJ5SAVE='["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:28:55.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:103: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/1 list
2026-03-08T23:28:56.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:104: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:56.947 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:105: create_scenario: grep '^[pm].*SNA_.*[.]1[.]obj5[.][.]$' td/osd-scrub-snaps/drk.log
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:28:56.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:106: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --rmtype nosnapmap '["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' remove
2026-03-08T23:28:57.573 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:c52c9666:::obj5:1#
2026-03-08T23:28:58.107 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:108: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/1 list
2026-03-08T23:28:59.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:109: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:59.184 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.0779578A.7.obj4..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:110: create_scenario: grep '^[pm].*SNA_.*[.]1[.]obj5[.][.]$' td/osd-scrub-snaps/drk.log
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:28:59.185 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:111: create_scenario: rm -f td/osd-scrub-snaps/drk.log
2026-03-08T23:28:59.186 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --op list obj5
2026-03-08T23:28:59.186 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: grep '"snapid":4'
2026-03-08T23:29:00.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":4,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:29:00.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:114: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=18
2026-03-08T23:29:00.024 INFO:tasks.workunit.client.0.vm03.stderr:18+0 records in
2026-03-08T23:29:00.024 INFO:tasks.workunit.client.0.vm03.stderr:18+0 records out
2026-03-08T23:29:00.024 INFO:tasks.workunit.client.0.vm03.stderr:4608 bytes (4.6 kB, 4.5 KiB) copied, 0.000129452 s, 35.6 MB/s
2026-03-08T23:29:00.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:115: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj5","key":"","snapid":4,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670
2026-03-08T23:29:01.195 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:117: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj3
2026-03-08T23:29:02.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:117: create_scenario: JSON='["1.0",{"oid":"obj3","key":"","snapid":-2,"hash":1643547569,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:29:02.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:118: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=15
2026-03-08T23:29:02.040 INFO:tasks.workunit.client.0.vm03.stderr:15+0 records in
2026-03-08T23:29:02.040 INFO:tasks.workunit.client.0.vm03.stderr:15+0 records out
2026-03-08T23:29:02.040 INFO:tasks.workunit.client.0.vm03.stderr:3840 bytes (3.8 kB, 3.8 KiB) copied, 9.5178e-05 s, 40.3 MB/s
2026-03-08T23:29:02.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:119: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj3","key":"","snapid":-2,"hash":1643547569,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670
2026-03-08T23:29:03.208 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --op list obj4
2026-03-08T23:29:03.208 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: grep '"snapid":7'
2026-03-08T23:29:04.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: JSON='["1.0",{"oid":"obj4","key":"","snapid":7,"hash":2826278768,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:29:04.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:122: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj4","key":"","snapid":7,"hash":2826278768,"max":0,"pool":1,"namespace":"","max":0}]' remove
2026-03-08T23:29:04.727 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:0ee9ae15:::obj4:7#
2026-03-08T23:29:05.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:125: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/1 list
2026-03-08T23:29:06.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:126: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:29:06.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:127: create_scenario: grep '^[pm].*SNA_.*[.]7[.]obj16[.][.]$' td/osd-scrub-snaps/drk.log
2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --op list obj16 2026-03-08T23:29:06.340 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: grep '"snapid":7' 2026-03-08T23:29:07.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: JSON='["1.0",{"oid":"obj16","key":"","snapid":7,"hash":2060580962,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:07.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:129: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --rmtype snapmap '["1.0",{"oid":"obj16","key":"","snapid":7,"hash":2060580962,"max":0,"pool":1,"namespace":"","max":0}]' remove 2026-03-08T23:29:07.819 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:461f8b5e:::obj16:7# 2026-03-08T23:29:08.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:131: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/1 list 2026-03-08T23:29:09.431 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:132: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2.. 
2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5.. 
2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2.. 2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2.. 
2026-03-08T23:29:09.431 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:133: create_scenario: grep '^[pm].*SNA_.*[.]7[.]obj16[.][.]$' td/osd-scrub-snaps/drk.log 2026-03-08T23:29:09.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:134: create_scenario: rm -f td/osd-scrub-snaps/drk.log 2026-03-08T23:29:09.433 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:136: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj2 2026-03-08T23:29:10.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:136: create_scenario: JSON='["1.0",{"oid":"obj2","key":"","snapid":-2,"hash":1058988552,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:10.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:137: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj2","key":"","snapid":-2,"hash":1058988552,"max":0,"pool":1,"namespace":"","max":0}]' rm-attr snapset 2026-03-08T23:29:11.447 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: echo '["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:11.447 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: sed 's/snapid":1/snapid":7/' 2026-03-08T23:29:11.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: 
JSON='["1.0",{"oid":"obj5","key":"","snapid":7,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:11.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:141: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=7 2026-03-08T23:29:11.449 INFO:tasks.workunit.client.0.vm03.stderr:7+0 records in 2026-03-08T23:29:11.449 INFO:tasks.workunit.client.0.vm03.stderr:7+0 records out 2026-03-08T23:29:11.449 INFO:tasks.workunit.client.0.vm03.stderr:1792 bytes (1.8 kB, 1.8 KiB) copied, 0.000100749 s, 17.8 MB/s 2026-03-08T23:29:11.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:142: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj5","key":"","snapid":7,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670 2026-03-08T23:29:12.615 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:144: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj6 2026-03-08T23:29:12.923 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:13.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:144: create_scenario: JSON='["1.0",{"oid":"obj6","key":"","snapid":-2,"hash":2202164420,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:13.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:145: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj6","key":"","snapid":-2,"hash":2202164420,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset 2026-03-08T23:29:14.631 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:146: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj7 2026-03-08T23:29:14.939 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:15.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:146: create_scenario: JSON='["1.0",{"oid":"obj7","key":"","snapid":-2,"hash":1552453721,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:15.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:147: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj7","key":"","snapid":-2,"hash":1552453721,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset corrupt 2026-03-08T23:29:16.639 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:148: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj8 2026-03-08T23:29:16.947 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:17.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:148: create_scenario: JSON='["1.0",{"oid":"obj8","key":"","snapid":-2,"hash":2381834917,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:17.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:149: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj8","key":"","snapid":-2,"hash":2381834917,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset seq 
2026-03-08T23:29:18.643 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:150: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj9 2026-03-08T23:29:18.951 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:19.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:150: create_scenario: JSON='["1.0",{"oid":"obj9","key":"","snapid":-2,"hash":3833113727,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:19.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:151: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj9","key":"","snapid":-2,"hash":3833113727,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset clone_size 2026-03-08T23:29:20.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:152: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj10 2026-03-08T23:29:20.964 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:21.495 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:152: create_scenario: JSON='["1.0",{"oid":"obj10","key":"","snapid":-2,"hash":718195851,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:21.495 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:153: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj10","key":"","snapid":-2,"hash":718195851,"max":0,"pool":1,"namespace":"","max":0}]' 
clear-snapset clone_overlap 2026-03-08T23:29:22.667 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:154: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj11 2026-03-08T23:29:22.975 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:23.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:154: create_scenario: JSON='["1.0",{"oid":"obj11","key":"","snapid":-2,"hash":693400951,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:23.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:155: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj11","key":"","snapid":-2,"hash":693400951,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset clones 2026-03-08T23:29:24.967 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:156: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj12 2026-03-08T23:29:25.275 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:25.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:156: create_scenario: JSON='["1.0",{"oid":"obj12","key":"","snapid":-2,"hash":3551132405,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:25.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:157: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 
'["1.0",{"oid":"obj12","key":"","snapid":-2,"hash":3551132405,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset head 2026-03-08T23:29:26.976 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:158: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj13 2026-03-08T23:29:27.283 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:27.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:158: create_scenario: JSON='["1.0",{"oid":"obj13","key":"","snapid":-2,"hash":2087409765,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:27.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:159: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj13","key":"","snapid":-2,"hash":2087409765,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset snaps 2026-03-08T23:29:29.147 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:160: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj14 2026-03-08T23:29:29.460 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:29.996 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:160: create_scenario: JSON='["1.0",{"oid":"obj14","key":"","snapid":-2,"hash":2484217095,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:29.996 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:161: create_scenario: ceph-objectstore-tool --data-path 
td/osd-scrub-snaps/1 '["1.0",{"oid":"obj14","key":"","snapid":-2,"hash":2484217095,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset size 2026-03-08T23:29:31.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:163: create_scenario: echo garbage 2026-03-08T23:29:31.166 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:164: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 --head --op list obj15 2026-03-08T23:29:31.474 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available 2026-03-08T23:29:32.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:164: create_scenario: JSON='["1.0",{"oid":"obj15","key":"","snapid":-2,"hash":612772309,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:29:32.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:165: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/1 '["1.0",{"oid":"obj15","key":"","snapid":-2,"hash":612772309,"max":0,"pool":1,"namespace":"","max":0}]' set-attr snapset td/osd-scrub-snaps/bad 2026-03-08T23:29:33.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:166: create_scenario: rm -f td/osd-scrub-snaps/bad 2026-03-08T23:29:33.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:167: create_scenario: return 0 2026-03-08T23:29:33.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:767: _scrub_snaps_multi: rm -f testdata.420670 2026-03-08T23:29:33.173 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:769: _scrub_snaps_multi: expr 2 - 1 2026-03-08T23:29:33.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:769: _scrub_snaps_multi: seq 0 1 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:769: _scrub_snaps_multi: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:771: _scrub_snaps_multi: activate_osd td/osd-scrub-snaps 0 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-snaps 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-snaps/0 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 
--osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/0' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/0/journal' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:29:33.175 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:29:33.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:29:33.176 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:29:33.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-snaps/0 2026-03-08T23:29:33.177 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:29:33.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0 2026-03-08T23:29:33.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= 
--run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:29:33.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-snaps/0/whoami 2026-03-08T23:29:33.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']' 2026-03-08T23:29:33.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:29:33.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:29:33.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:29:33.196 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:33.200+0000 7fa2229388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:29:33.204 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:33.208+0000 7fa2229388c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:29:33.207 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:33.208+0000 7fa2229388c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:29:33.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:29:33.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:29:34.417 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:34.420+0000 7fa2229388c0 -1 Falling back to public interface 2026-03-08T23:29:34.517 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:29:34.518 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:29:34.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:29:34.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:29:34.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:29:34.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:29:34.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:29:35.408 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:35.412+0000 7fa2229388c0 -1 osd.0 22 log_to_monitors true 2026-03-08T23:29:35.686 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:29:35.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:29:35.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:29:35.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:29:35.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:29:35.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:29:35.872 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:29:36.873 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:29:36.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:29:36.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:29:36.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:29:36.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:29:36.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 25 up_thru 25 down_at 23 last_clean_interval [5,22) [v2:127.0.0.1:6802/1500136262,v1:127.0.0.1:6803/1500136262] [v2:127.0.0.1:6804/1500136262,v1:127.0.0.1:6805/1500136262] exists,up d9f2ab9c-cf1e-4f64-9803-1a7e34ca33a9 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:769: _scrub_snaps_multi: for osd in $(seq 0 $(expr 
$OSDS - 1)) 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:771: _scrub_snaps_multi: activate_osd td/osd-scrub-snaps 1 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-snaps 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-snaps/1 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:29:37.035 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/1' 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/1/journal' 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps' 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:29:37.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:29:37.036 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 
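Note how several `ceph_args` entries above are appended inside single quotes (`'--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok'`, `'--log-file=td/osd-scrub-snaps/$name.log'`): the shell passes `$cluster` and `$name` to `ceph-osd` literally, and the daemon expands those metavariables itself. A minimal demonstration of the quoting difference (the `name` value here is hypothetical, just for illustration):

```shell
# Single quotes keep metavariables like $name literal for the daemon to
# expand later; double quotes make the shell substitute them immediately.
name=osd.1   # hypothetical daemon name for this demonstration

literal='--log-file=td/$name.log'         # "$name" survives as-is
shell_expanded="--log-file=td/$name.log"  # the shell substitutes now

echo "$literal"          # -> --log-file=td/$name.log
echo "$shell_expanded"   # -> --log-file=td/osd.1.log
```

This is why the traced `ceph-osd` command line at `ceph-helpers.sh:874` shows some arguments quoted and others not: only the ones carrying per-daemon metavariables need protection from the invoking shell.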
2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:29:37.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-snaps/1 2026-03-08T23:29:37.037 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:29:37.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:29:37.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/1 --osd-journal=td/osd-scrub-snaps/1/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:29:37.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-snaps/1/whoami 2026-03-08T23:29:37.038 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:29:37.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:29:37.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:29:37.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:29:37.056 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:37.060+0000 7f5630a2c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:29:37.064 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:37.068+0000 7f5630a2c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:29:37.065 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:37.068+0000 7f5630a2c8c0 -1 WARNING: all dangerous and experimental features are enabled. 
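The `wait_for_osd` traces in this log follow a plain poll-and-sleep pattern: run `ceph osd dump | grep 'osd.N up'` once per second, up to 300 attempts. A self-contained sketch of that pattern, with a stub predicate standing in for the cluster query (function names here are illustrative, not the real helper):

```shell
# Poll-and-sleep loop in the style of wait_for_osd: try a predicate up to
# $tries times, sleeping between attempts; return 0 on success, 1 on timeout.
wait_for_condition() {
    local tries=${1:-300}
    local i
    for ((i = 0; i < tries; i++)); do
        if osd_is_up; then
            return 0          # e.g. "osd.0 up" appeared in the osd dump
        fi
        sleep 0.01            # the real helper sleeps 1 second per attempt
    done
    return 1                  # gave up after $tries polls
}

# Stub predicate: reports "up" on the third call, mimicking the ~3 polls
# the trace shows before osd.0 came up.
polls=0
osd_is_up() {
    polls=$((polls + 1))
    [ "$polls" -ge 3 ]
}

wait_for_condition 300
echo "up after $polls polls"   # -> up after 3 polls
```

The `echo $i` inside the real loop is what produces the bare `0`, `1`, `2`, `3` stdout lines interleaved through the trace above.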
2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:29:37.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:29:37.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:29:38.288 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:38.292+0000 7f5630a2c8c0 -1 Falling back to public interface 2026-03-08T23:29:38.384 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:29:38.384 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:29:38.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:29:38.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:29:38.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:29:38.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:29:38.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:29:39.259 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:29:39.260+0000 7f5630a2c8c0 -1 osd.1 22 log_to_monitors true 2026-03-08T23:29:39.546 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:29:39.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:29:39.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:29:39.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:29:39.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:29:39.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:29:39.728 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:29:40.729 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:29:40.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:29:40.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:29:40.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:29:40.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:29:40.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:29:40.895 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 28 up_thru 28 down_at 23 last_clean_interval [9,22) [v2:127.0.0.1:6810/528715299,v1:127.0.0.1:6811/528715299] [v2:127.0.0.1:6812/528715299,v1:127.0.0.1:6813/528715299] exists,up f6820cf7-1215-46c2-a267-6952058a734a 2026-03-08T23:29:40.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:29:40.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:29:40.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:29:40.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:774: _scrub_snaps_multi: ceph tell 'osd.*' config set 
osd_shallow_scrub_chunk_max 3 2026-03-08T23:29:41.044 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:29:41.044 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_shallow_scrub_chunk_max = '' (not observed, change may require restart) osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T23:29:41.044 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.051 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:29:41.051 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_shallow_scrub_chunk_max = '' (not observed, change may require restart) osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T23:29:41.051 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:775: 
_scrub_snaps_multi: ceph tell 'osd.*' config set osd_shallow_scrub_chunk_min 3 2026-03-08T23:29:41.131 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:29:41.147 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_shallow_scrub_chunk_min = '' (not observed, change may require restart) " 2026-03-08T23:29:41.147 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.147 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:29:41.147 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_shallow_scrub_chunk_min = '' (not observed, change may require restart) " 2026-03-08T23:29:41.147 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:776: _scrub_snaps_multi: ceph tell 'osd.*' config set osd_scrub_chunk_min 3 2026-03-08T23:29:41.252 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:29:41.252 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_chunk_min = '' (not observed, change may require restart) " 2026-03-08T23:29:41.252 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.258 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:29:41.292 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_chunk_min = '' (not observed, change may require restart) " 2026-03-08T23:29:41.292 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:777: _scrub_snaps_multi: ceph tell 'osd.*' config set osd_pg_stat_report_interval_max_seconds 1 2026-03-08T23:29:41.339 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:29:41.339 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_seconds = '' (not observed, change may require restart) " 2026-03-08T23:29:41.339 
INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.346 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:29:41.346 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_seconds = '' (not observed, change may require restart) " 2026-03-08T23:29:41.346 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:778: _scrub_snaps_multi: ceph tell 'osd.*' config set osd_pg_stat_report_interval_max_epochs 1 2026-03-08T23:29:41.430 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:29:41.431 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_epochs = '' (not observed, change may require restart) " 2026-03-08T23:29:41.431 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.437 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:29:41.437 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_epochs = '' (not observed, change may require restart) " 2026-03-08T23:29:41.437 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:29:41.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:779: _scrub_snaps_multi: wait_for_clean 2026-03-08T23:29:41.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:29:41.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:29:41.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:29:41.448 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:29:41.449 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:29:41.449 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:29:41.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:29:41.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:29:41.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:29:41.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:29:41.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:29:41.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:29:41.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:29:41.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:29:41.503 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:29:41.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:29:41.665 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:29:41.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:29:41.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:29:41.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:29:41.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182403 2026-03-08T23:29:41.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182403 2026-03-08T23:29:41.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182403' 2026-03-08T23:29:41.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:29:41.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:29:41.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084290 2026-03-08T23:29:41.825 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084290 2026-03-08T23:29:41.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182403 1-120259084290' 2026-03-08T23:29:41.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:29:41.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-107374182403 2026-03-08T23:29:41.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:29:41.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:29:41.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-107374182403 2026-03-08T23:29:41.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:29:41.827 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 107374182403 2026-03-08T23:29:41.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182403 2026-03-08T23:29:41.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 107374182403' 2026-03-08T23:29:41.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
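The delays array logged above for `get_timeout_delays 90 .1` (`0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`) is an exponential-backoff series: each delay doubles from the initial value, is capped (at 15 s here), and the final entry is trimmed so the series sums to exactly the 90-second timeout. A reconstruction of that generation logic (awk handles the float arithmetic; the real helper's internals may differ):

```shell
# Reconstruction of get_timeout_delays(timeout, first_delay): emit a series
# of sleep intervals that doubles up to a cap and sums to exactly $timeout.
get_timeout_delays() {
    local timeout=$1 cap=${3:-15}
    local d total=0
    local -a out=()
    d=$(awk -v x="$2" 'BEGIN{print x + 0}')      # normalize ".1" -> 0.1
    while awk -v t="$total" -v to="$timeout" 'BEGIN{exit !(t+0 < to+0)}'; do
        # trim the last delay so the running total lands exactly on timeout
        if awk -v t="$total" -v d="$d" -v to="$timeout" 'BEGIN{exit !(t+d > to+0)}'; then
            d=$(awk -v t="$total" -v to="$timeout" 'BEGIN{print to - t}')
        fi
        out+=("$d")
        total=$(awk -v t="$total" -v d="$d" 'BEGIN{print t + d}')
        d=$(awk -v d="$d" -v c="$cap" 'BEGIN{n = d * 2; print (n > c) ? c : n}')
    done
    echo "${out[@]}"
}

get_timeout_delays 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

The shape matters: early retries are cheap (sub-second), but a cluster that takes tens of seconds to settle is only polled every 15 s, bounding both latency and log noise while guaranteeing the wait never exceeds the stated timeout.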
2026-03-08T23:29:41.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182402 -lt 107374182403 2026-03-08T23:29:41.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:29:42.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:29:42.994 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:29:43.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182402 -lt 107374182403 2026-03-08T23:29:43.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:29:44.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:29:44.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:29:44.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182403 -lt 107374182403 2026-03-08T23:29:44.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:29:44.319 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-120259084290 2026-03-08T23:29:44.319 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:29:44.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:29:44.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-120259084290 2026-03-08T23:29:44.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:29:44.322 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 120259084290 2026-03-08T23:29:44.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084290 2026-03-08T23:29:44.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 120259084290' 2026-03-08T23:29:44.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:29:44.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084290 -lt 120259084290 2026-03-08T23:29:44.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:29:44.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:29:44.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 
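The `flush_pg_stats` bookkeeping traced here records one `osd-seq` pair per OSD (e.g. `0-107374182403`), then splits each pair with `cut` and polls `ceph osd last-stat-seq <osd>` until the reported value catches up to the flush's sequence number. A runnable sketch of that loop, with a stub standing in for the cluster query:

```shell
# Format matches the trace: space-separated "osd-seq" pairs.
seqs=" 0-107374182403 1-120259084290"

# Stub for "ceph osd last-stat-seq <osd>": report a value that has already
# caught up, so the sketch terminates immediately.
last_stat_seq() { echo 999999999999; }

waited=0
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)   # before the dash: the OSD id
    seq=$(echo "$s" | cut -d - -f 2)   # after the dash: the flush seq number
    echo "waiting osd.$osd seq $seq"
    while test "$(last_stat_seq "$osd")" -lt "$seq"; do
        sleep 1                         # real helper also decrements a timeout
    done
    waited=$((waited + 1))
done
echo "$waited OSDs caught up"          # -> 2 OSDs caught up
```

In the real run above, osd.0's reported seq lagged by one (`107374182402 -lt 107374182403`) for two polls before catching up; osd.1 was already current on the first check.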
2026-03-08T23:29:44.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:29:44.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:29:44.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:29:44.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:29:44.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:29:44.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:29:44.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:29:44.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:29:44.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:29:44.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:29:44.857 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:29:44.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:781: _scrub_snaps_multi: local pgid=1.0 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:782: _scrub_snaps_multi: pg_scrub 1.0 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=1.0 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 1.0 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=1.0 2026-03-08T23:29:45.054 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:29:45.055 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:29:45.055 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:29:45.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:29:45.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:29:45.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:29:45.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:29:45.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:29:45.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:29:45.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:29:45.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:29:45.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 
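Just above, `wait_for_pg_clean` calls `get_timeout_delays 90 1 3` and gets back the 31-entry schedule `('1' '2' '3' '3' … '3')`: the sleeps ramp up to the 3-second step and then repeat it until they sum to the 90-second budget. A hypothetical reconstruction that reproduces the observed output (the ramp arithmetic is an assumption; it matches these arguments, not necessarily the helper's exact formula):

```python
def timeout_delays(timeout, first, step):
    """Build a sleep schedule: ramp from `first` toward `step`,
    then repeat `step` until the delays sum to `timeout`."""
    delays, total, d = [], 0, first
    while total < timeout:
        d = min(d, timeout - total)   # never overshoot the budget
        delays.append(d)
        total += d
        d = min(d + first, step)      # ramp up, then hold at `step`
    return delays

# timeout_delays(90, 1, 3) -> [1, 2, 3, 3, ..., 3]  (31 entries summing to 90)
```

The 31 entries and the 1-2-3 ramp agree exactly with the `delays=(...)` array the trace prints a few records later.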
2026-03-08T23:29:45.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:29:45.352 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:29:45.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:29:45.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:29:45.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:29:45.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=107374182404 2026-03-08T23:29:45.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 107374182404 2026-03-08T23:29:45.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182404' 2026-03-08T23:29:45.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:29:45.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:29:45.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084292 2026-03-08T23:29:45.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084292 2026-03-08T23:29:45.512 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-107374182404 1-120259084292' 2026-03-08T23:29:45.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:29:45.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-107374182404 2026-03-08T23:29:45.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:29:45.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:29:45.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-107374182404 2026-03-08T23:29:45.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:29:45.514 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 107374182404 2026-03-08T23:29:45.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=107374182404 2026-03-08T23:29:45.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 107374182404' 2026-03-08T23:29:45.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:29:45.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182403 -lt 
107374182404 2026-03-08T23:29:45.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:29:46.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:29:46.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:29:46.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182403 -lt 107374182404 2026-03-08T23:29:46.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:29:47.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:29:47.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:29:48.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 107374182405 -lt 107374182404 2026-03-08T23:29:48.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:29:48.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-120259084292 2026-03-08T23:29:48.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:29:48.021 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:29:48.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-120259084292 2026-03-08T23:29:48.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:29:48.023 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 120259084292 2026-03-08T23:29:48.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084292 2026-03-08T23:29:48.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 120259084292' 2026-03-08T23:29:48.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:29:48.188 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 1.0 loop 0 2026-03-08T23:29:48.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084292 -lt 120259084292 2026-03-08T23:29:48.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:29:48.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 1.0 loop 0' 2026-03-08T23:29:48.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 1.0 2026-03-08T23:29:48.189 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=1.0 2026-03-08T23:29:48.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:29:48.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 1.0 query 2026-03-08T23:29:48.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:29:48.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:29:48.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:29:48.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:29:48.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:29:48.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 1.0 2026-03-08T23:29:48.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:29:48.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:29:48.271 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:29:48.271 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:29:48.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local last_scrub=2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:48.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 1.0 2026-03-08T23:29:48.584 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to scrub 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 1.0 2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:29:48.596 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:29:48.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:29:48.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:28:44.120798+0000 '>' 2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:48.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:29:49.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:29:49.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:29:49.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:29:49.766 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:29:49.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:29:49.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:29:49.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:29:49.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:28:44.120798+0000 '>' 2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:49.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:29:50.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:29:50.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:29:50.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:29:50.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:29:50.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:29:50.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:29:50.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:29:51.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:28:44.120798+0000 '>' 2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:51.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:29:52.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:29:52.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:29:52.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:29:52.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:29:52.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:29:52.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:29:52.119 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:29:52.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:28:44.120798+0000 '>' 2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:52.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:29:53.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:29:53.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:29:53.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:29:53.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:29:53.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:29:53.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:29:53.289 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:29:53.453 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:28:44.120798+0000 '>' 2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:53.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:29:54.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:29:54.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:29:54.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:29:54.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:29:54.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:29:54.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:29:54.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:29:54.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:28:44.120798+0000 '>' 2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:54.619 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:29:55.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:29:55.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:29:55.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:29:55.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:29:55.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:29:55.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:29:55.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:29:55.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:29:49.423015+0000 '>' 2026-03-08T23:28:44.120798+0000 2026-03-08T23:29:55.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:29:55.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:786: 
_scrub_snaps_multi: grep '_scan_snaps start' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:29:55.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:786: _scrub_snaps_multi: wc -l 2026-03-08T23:29:55.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:786: _scrub_snaps_multi: test 14 -gt 3 2026-03-08T23:29:55.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:787: _scrub_snaps_multi: grep '_scan_snaps start' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:29:55.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:787: _scrub_snaps_multi: wc -l 2026-03-08T23:29:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:787: _scrub_snaps_multi: test 14 -gt 3 2026-03-08T23:29:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:789: _scrub_snaps_multi: rados list-inconsistent-pg test 2026-03-08T23:29:55.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:791: _scrub_snaps_multi: jq '. 
| length' td/osd-scrub-snaps/json 2026-03-08T23:29:55.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:791: _scrub_snaps_multi: test 1 = 1 2026-03-08T23:29:55.815 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:793: _scrub_snaps_multi: jq -r '.[0]' td/osd-scrub-snaps/json 2026-03-08T23:29:55.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:793: _scrub_snaps_multi: test 1.0 = 1.0 2026-03-08T23:29:55.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:795: _scrub_snaps_multi: rados list-inconsistent-obj 1.0 --format=json-pretty 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 28, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "inconsistents": [ 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj5", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "version": 5 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.843 
INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj5", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 1, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1718170787, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'19", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'5", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4179.0:1", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 5, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:49.055766+0000", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:49.056438+0000", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 
2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x56010eda", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.843 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.844 
INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj5", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 2, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "version": 20 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj5", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 2, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1718170787, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'41", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'20", 2026-03-08T23:29:55.844 
INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4227.0:1", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 20, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:49.955119+0000", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:49.955726+0000", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: { 
2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.844 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj5", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 4, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "version": 42 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch" 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:29:55.845 
INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch_info", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "obj_size_info_mismatch" 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj5", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 4, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1718170787, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19'45", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "17'42", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4263.0:1", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 42, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "size": 512, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.300371+0000", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.302410+0000", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 
2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x65c12660", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "size": 512 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: 
"size_mismatch_info", 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: "obj_size_info_mismatch" 2026-03-08T23:29:55.845 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "size": 4608, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "object_info": { 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj5", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 4, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1718170787, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19'45", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "17'42", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4263.0:1", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 42, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "size": 512, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.300371+0000", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.302410+0000", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.846 
INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x65c12660", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj4", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 7, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "version": 4 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.846 
INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj4", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 7, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2826278768, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "version": "22'51", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'4", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4176.0:1", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 4, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:49.032881+0000", 2026-03-08T23:29:55.846 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:49.033342+0000", 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 
2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x56010eda", 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 
2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj5", 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 7, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "version": 0 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "missing", 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "info_missing" 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.847 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "info_missing" 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1792 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.847 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj1", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "version": 18 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj1", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.848 
INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1828249343, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'18", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'1", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4224.0:1", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 18, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:49.932175+0000", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:49.932737+0000", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.848 
INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj10", 2026-03-08T23:29:55.848 
INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "version": 32 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.848 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj10", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 718195851, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'32", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'10", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4245.0:1", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 32, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: 
"mtime": "2026-03-08T23:28:50.104278+0000", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.104837+0000", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: 
"size": 256, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.849 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "????", 2026-03-08T23:29:55.850 
INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj11", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "version": 34 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj11", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:29:55.850 
INFO:tasks.workunit.client.0.vm03.stdout: "hash": 693400951, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'34", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'11", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4248.0:1", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 34, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.128227+0000", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.128729+0000", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.850 
INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:29:55.850 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: { 
2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [] 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj13", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "version": 38 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.851 
INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj13", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2087409765, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'38", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'13", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4254.0:1", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 38, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.175725+0000", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.176546+0000", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": 
"0xffffffff", 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.851 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.852 
INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]" 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj14", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 
2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "version": 40 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj14", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2484217095, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'40", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'14", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4257.0:1", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 40, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.201125+0000", 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.201635+0000", 2026-03-08T23:29:55.852 
INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.852 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 
2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1033, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:29:55.853 
INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj15", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "version": 15 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_corrupted" 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.853 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj15", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 612772309, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.854 
INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "version": "15'15", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "0'0", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4209.0:1", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 15, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:49.294783+0000", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:49.295506+0000", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x56010eda", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.854 
INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 0, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [] 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_corrupted" 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": "Z2FyYmFnZQo=" 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.854 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj2", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "version": 56 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_missing" 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.854 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj2", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1058988552, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "version": "22'56", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "19'48", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4299.0:1", 
2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 56, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "size": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.867783+0000", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.868356+0000", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "whiteout", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "dirty" 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0xffffffff", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 
2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "size": 0, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 7, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 4, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: 4, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: 3, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: 2, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 7, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1024, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: 7, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: 6, 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: 5 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: } 
2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.855 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_missing"
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "size": 0
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj3",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "version": 44
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch"
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch_info",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "obj_size_info_mismatch"
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj3",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1643547569,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "version": "18'44",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'22",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4269.0:1",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 44,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "size": 768,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.403900+0000",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.404476+0000",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x848fda93",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "size": 768
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:29:55.856 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch_info",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "obj_size_info_mismatch"
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "size": 3840,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "object_info": {
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj3",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1643547569,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "version": "18'44",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'22",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4269.0:1",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 44,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "size": 768,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.403900+0000",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.404476+0000",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x848fda93",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj6",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "version": 24
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency"
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [],
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj6",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2202164420,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'24",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'6",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4233.0:1",
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 24,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.857 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.004259+0000",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.004995+0000",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "clones": []
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj7",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "version": 26
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency"
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [],
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj7",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:29:55.858 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1552453721,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'26",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'7",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4236.0:1",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 26,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.031667+0000",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.032328+0000",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 0,
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: "clones": []
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.859 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj8",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "version": 28
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency"
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [],
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj8",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2381834917,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'28",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'8",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4239.0:1",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 28,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.056864+0000",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.057383+0000",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.860 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 0,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj9",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "version": 30
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency"
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [],
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj9",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 3833113727,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'30",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "15'9",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4242.0:1",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 30,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:28:50.080248+0000",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:28:50.080797+0000",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x9c3e29b8",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.861 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "size": "????",
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:29:55.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:797: _scrub_snaps_multi: rados list-inconsistent-snapset 1.0
2026-03-08T23:29:55.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:800: _scrub_snaps_multi: '[' primary = replica ']'
2026-03-08T23:29:55.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:811: _scrub_snaps_multi: scruberrors=30
2026-03-08T23:29:55.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:812: _scrub_snaps_multi: jq .inconsistents
2026-03-08T23:29:55.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:812: _scrub_snaps_multi: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print ( json.dumps(ud, sort_keys=True, indent=2) )'
2026-03-08T23:29:55.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1066: _scrub_snaps_multi: jq .inconsistents td/osd-scrub-snaps/json
2026-03-08T23:29:55.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1066: _scrub_snaps_multi: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print ( json.dumps(ud, sort_keys=True, indent=2) )'
2026-03-08T23:29:55.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1067: _scrub_snaps_multi: multidiff td/osd-scrub-snaps/checkcsjson td/osd-scrub-snaps/csjson
2026-03-08T23:29:55.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-snaps/checkcsjson td/osd-scrub-snaps/csjson
2026-03-08T23:29:55.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1068: _scrub_snaps_multi: test no = yes
2026-03-08T23:29:55.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1073: _scrub_snaps_multi: test '' = yes
2026-03-08T23:29:55.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1078: _scrub_snaps_multi: find td/osd-scrub-snaps
2026-03-08T23:29:55.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1078: _scrub_snaps_multi: grep 'osd[^/]*\.pid'
2026-03-08T23:29:55.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1078: _scrub_snaps_multi: pidfiles='td/osd-scrub-snaps/osd.0.pid
2026-03-08T23:29:55.889 INFO:tasks.workunit.client.0.vm03.stderr:td/osd-scrub-snaps/osd.1.pid'
2026-03-08T23:29:55.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1079: _scrub_snaps_multi: pids=
2026-03-08T23:29:55.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1080: _scrub_snaps_multi: for pidfile in ${pidfiles}
2026-03-08T23:29:55.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1082: _scrub_snaps_multi: cat td/osd-scrub-snaps/osd.0.pid
2026-03-08T23:29:55.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1082: _scrub_snaps_multi: pids+='454502 '
2026-03-08T23:29:55.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1080: _scrub_snaps_multi: for pidfile in ${pidfiles}
2026-03-08T23:29:55.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1082: _scrub_snaps_multi: cat td/osd-scrub-snaps/osd.1.pid
2026-03-08T23:29:55.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1082: _scrub_snaps_multi: pids+='455066 '
2026-03-08T23:29:55.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1085: _scrub_snaps_multi: ERRORS=0
2026-03-08T23:29:55.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1089: _scrub_snaps_multi: '[' primary = primary ']'
2026-03-08T23:29:55.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1091: _scrub_snaps_multi: seq 1 7
2026-03-08T23:29:55.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1091: _scrub_snaps_multi: for i in `seq 1 7`
2026-03-08T23:29:55.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1093: _scrub_snaps_multi: rados -p test rmsnap snap1
2026-03-08T23:29:55.964 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap1
2026-03-08T23:29:55.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1091: _scrub_snaps_multi: for i in `seq 1 7`
2026-03-08T23:29:55.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1093: _scrub_snaps_multi: rados -p test rmsnap snap2
2026-03-08T23:29:56.065 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap2
2026-03-08T23:29:56.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1091: _scrub_snaps_multi: for i in `seq 1 7`
2026-03-08T23:29:56.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1093: _scrub_snaps_multi: rados -p test rmsnap snap3
2026-03-08T23:29:56.170 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap3
2026-03-08T23:29:56.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1091: _scrub_snaps_multi: for i in `seq 1 7`
2026-03-08T23:29:56.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1093: _scrub_snaps_multi: rados -p test rmsnap snap4
2026-03-08T23:29:56.274 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap4
2026-03-08T23:29:56.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1091: _scrub_snaps_multi: for i in `seq 1 7`
2026-03-08T23:29:56.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1093: _scrub_snaps_multi: rados -p test rmsnap snap5
2026-03-08T23:29:56.378 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap5
2026-03-08T23:29:56.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1091: _scrub_snaps_multi: for i in `seq 1 7`
2026-03-08T23:29:56.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1093: _scrub_snaps_multi: rados -p test rmsnap snap6
2026-03-08T23:29:56.482 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap6
2026-03-08T23:29:56.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1091: _scrub_snaps_multi: for i in `seq 1 7`
2026-03-08T23:29:56.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1093: _scrub_snaps_multi: rados -p test rmsnap snap7
2026-03-08T23:29:56.586 INFO:tasks.workunit.client.0.vm03.stdout:removed pool test snap snap7
2026-03-08T23:29:56.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1095: _scrub_snaps_multi: sleep 5
2026-03-08T23:30:01.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1096: _scrub_snaps_multi: local -i loop=0
2026-03-08T23:30:01.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1097: _scrub_snaps_multi: ceph pg dump pgs
2026-03-08T23:30:01.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1097: _scrub_snaps_multi: grep -q snaptrim
2026-03-08T23:30:01.767 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:30:01.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1099: _scrub_snaps_multi: ceph pg dump pgs
2026-03-08T23:30:01.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1099: _scrub_snaps_multi: grep -q snaptrim_error
2026-03-08T23:30:01.947 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:30:01.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1101: _scrub_snaps_multi: break
2026-03-08T23:30:01.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1111: _scrub_snaps_multi: ceph pg dump pgs
2026-03-08T23:30:02.122 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED
2026-03-08T23:30:02.122 INFO:tasks.workunit.client.0.vm03.stdout:1.0 36 0 0 0 0 24448 0 0 56 0 56 active+clean+inconsistent+snaptrim_error 2026-03-08T23:29:56.595075+0000 22'56 36:137 [1,0] 1 [1,0] 1 22'56 2026-03-08T23:29:49.423015+0000 0'0 2026-03-08T23:28:44.120798+0000 7 1 periodic scrub scheduled @ 2026-03-09T23:56:35.103946+0000 33 0
2026-03-08T23:30:02.122 INFO:tasks.workunit.client.0.vm03.stdout:
2026-03-08T23:30:02.122 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details.
2026-03-08T23:30:02.122 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:30:02.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1113: _scrub_snaps_multi: for pid in $pids
2026-03-08T23:30:02.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1115: _scrub_snaps_multi: kill -0 454502
2026-03-08T23:30:02.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1113: _scrub_snaps_multi: for pid in $pids
2026-03-08T23:30:02.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1115: _scrub_snaps_multi: kill -0 455066
2026-03-08T23:30:02.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1122: _scrub_snaps_multi: kill_daemons td/osd-scrub-snaps
2026-03-08T23:30:02.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:30:02.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:30:02.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:30:02.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:30:02.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1124: _scrub_snaps_multi: declare -a err_strings
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1125: _scrub_snaps_multi: err_strings[0]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj4:7 : missing'
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1126: _scrub_snaps_multi: err_strings[1]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] soid .*:::obj3:head : size 3840 != size 768 from auth oi'
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1127: _scrub_snaps_multi: err_strings[2]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj5:1 : missing'
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1128: _scrub_snaps_multi: err_strings[3]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj5:2 : missing'
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1129: _scrub_snaps_multi: err_strings[4]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] soid .*:::obj5:4 : size 4608 != size 512 from auth oi'
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1130: _scrub_snaps_multi: err_strings[5]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid .*:::obj5:7 : failed to pick suitable object info'
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1131: _scrub_snaps_multi: err_strings[6]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj1:head : missing'
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1132: _scrub_snaps_multi: err_strings[7]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 30 errors'
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}"
2026-03-08T23:30:07.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj4:7 : missing' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}"
2026-03-08T23:30:07.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] soid .*:::obj3:head : size 3840 != size 768 from auth oi' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}"
2026-03-08T23:30:07.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj5:1 : missing' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}"
2026-03-08T23:30:07.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj5:2 : missing' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}"
2026-03-08T23:30:07.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] soid .*:::obj5:4 : size 4608 != size 512 from auth oi' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}"
2026-03-08T23:30:07.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid .*:::obj5:7 : failed to pick suitable object info' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}"
2026-03-08T23:30:07.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj1:head : missing' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}"
2026-03-08T23:30:07.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 30 errors' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1144: _scrub_snaps_multi: declare -a rep_err_strings
2026-03-08T23:30:07.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1145: _scrub_snaps_multi: eval echo '$primary'
2026-03-08T23:30:07.462 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1145: _scrub_snaps_multi: echo 1
2026-03-08T23:30:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1145: _scrub_snaps_multi: osd=1
2026-03-08T23:30:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1146: _scrub_snaps_multi: rep_err_strings[0]='log_channel[(]cluster[)] log [[]ERR[]] : osd[.][0-9]* found snap mapper error on pg 1.0 oid 1:461f8b5e:::obj16:7 snaps missing in mapper, should be: {1, 2, 3, 4, 5, 6, 7} ...repaired'
2026-03-08T23:30:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1147: _scrub_snaps_multi: for err_string in "${rep_err_strings[@]}"
2026-03-08T23:30:07.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1149: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : osd[.][0-9]* found snap mapper error on pg 1.0 oid 1:461f8b5e:::obj16:7 snaps missing in mapper, should be: {1, 2, 3, 4, 5, 6, 7} ...repaired' td/osd-scrub-snaps/osd.1.log
2026-03-08T23:30:07.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1156: _scrub_snaps_multi: '[' 0 '!=' 0 ']'
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stdout:TEST PASSED
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1162: _scrub_snaps_multi: echo 'TEST PASSED'
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1163: _scrub_snaps_multi: return 0
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1181: TEST_scrub_snaps_primary: err=0
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1182: TEST_scrub_snaps_primary: CEPH_ARGS='--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 '
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1183: TEST_scrub_snaps_primary: return 0
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:40: run: teardown td/osd-scrub-snaps
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-snaps
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-snaps KILL
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:30:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:30:07.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:30:07.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:30:07.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:30:07.470 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:30:07.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:30:07.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:30:07.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:30:07.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:30:07.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:30:07.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:30:07.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:30:07.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:30:07.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:30:07.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-snaps
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.420670
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:37: run: for func in $funcs
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:38: run: setup td/osd-scrub-snaps
2026-03-08T23:30:07.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-snaps
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-snaps
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-snaps
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-snaps KILL
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:30:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:30:07.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:30:07.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:30:07.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:30:07.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:30:07.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:30:07.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:30:07.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:30:07.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:30:07.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:30:07.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:30:07.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:30:07.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:30:07.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:30:07.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-snaps
2026-03-08T23:30:07.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:30:07.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:30:07.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:30:07.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.420670
2026-03-08T23:30:07.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:30:07.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:30:07.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-snaps
2026-03-08T23:30:07.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:30:07.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:30:07.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:30:07.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.420670
2026-03-08T23:30:07.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-snaps 1' TERM HUP INT
2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:39: run: TEST_scrub_snaps_replica td/osd-scrub-snaps
2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1167: TEST_scrub_snaps_replica: local dir=td/osd-scrub-snaps
2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1168: TEST_scrub_snaps_replica: ORIG_ARGS='--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 '
2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1169: TEST_scrub_snaps_replica: CEPH_ARGS+=' --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1'
2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1170: TEST_scrub_snaps_replica: _scrub_snaps_multi td/osd-scrub-snaps replica
2026-03-08T23:30:07.495
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:732: _scrub_snaps_multi: local dir=td/osd-scrub-snaps 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:733: _scrub_snaps_multi: local poolname=test 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:734: _scrub_snaps_multi: local OBJS=16 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:735: _scrub_snaps_multi: local OSDS=2 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:736: _scrub_snaps_multi: local which=replica 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:738: _scrub_snaps_multi: TESTDATA=testdata.420670 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:740: _scrub_snaps_multi: run_mon td/osd-scrub-snaps a --osd_pool_default_size=2 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-snaps 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:30:07.495 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-snaps/a 2026-03-08T23:30:07.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-snaps/a --run-dir=td/osd-scrub-snaps --osd_pool_default_size=2 2026-03-08T23:30:07.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:30:07.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:30:07.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:30:07.811 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:30:07.811 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:30:07.811 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:30:07.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:30:07.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= 
--mon-data=td/osd-scrub-snaps/a '--log-file=td/osd-scrub-snaps/$name.log' '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-snaps/log --run-dir=td/osd-scrub-snaps '--pid-file=td/osd-scrub-snaps/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2 2026-03-08T23:30:07.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:30:07.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:30:07.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:30:07.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:30:07.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:30:07.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:30:07.843 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:30:07.843 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:30:07.843 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:30:07.844 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:30:07.844 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:30:07.844 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:30:07.844 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.420670/ceph-mon.a.asok 2026-03-08T23:30:07.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:30:07.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.420670/ceph-mon.a.asok config get fsid 2026-03-08T23:30:07.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:30:07.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:30:07.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:30:07.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:30:07.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:30:07.917 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:30:07.917 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:30:07.917 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:30:07.917 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:30:07.917 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:30:07.917 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:30:07.918 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.420670/ceph-mon.a.asok 2026-03-08T23:30:07.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:30:07.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.420670/ceph-mon.a.asok config get mon_host 2026-03-08T23:30:07.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:741: _scrub_snaps_multi: run_mgr td/osd-scrub-snaps x 2026-03-08T23:30:07.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-snaps 2026-03-08T23:30:07.983 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:30:07.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:30:07.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:30:07.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-snaps/x 2026-03-08T23:30:07.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:30:08.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:30:08.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:30:08.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:30:08.092 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:30:08.092 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:30:08.092 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:30:08.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo 
'/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:30:08.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:30:08.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-snaps/x '--log-file=td/osd-scrub-snaps/$name.log' '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --run-dir=td/osd-scrub-snaps '--pid-file=td/osd-scrub-snaps/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:30:08.113 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:742: _scrub_snaps_multi: expr 2 - 1 2026-03-08T23:30:08.120 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:742: _scrub_snaps_multi: seq 0 1 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:742: _scrub_snaps_multi: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:744: _scrub_snaps_multi: run_osd td/osd-scrub-snaps 0 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-snaps 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:30:08.121 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-snaps/0 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/0' 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/0/journal' 2026-03-08T23:30:08.121 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps' 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:30:08.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:30:08.122 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:30:08.122 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-snaps/0 2026-03-08T23:30:08.122 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:30:08.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=3f95fae2-5dbe-4f2d-a42a-959f9e99343c 2026-03-08T23:30:08.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 3f95fae2-5dbe-4f2d-a42a-959f9e99343c' 2026-03-08T23:30:08.123 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 3f95fae2-5dbe-4f2d-a42a-959f9e99343c 2026-03-08T23:30:08.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:30:08.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCABq5pPlenCBAA4LDBjd1dNu5iJmNp/ooKtQ== 2026-03-08T23:30:08.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCABq5pPlenCBAA4LDBjd1dNu5iJmNp/ooKtQ=="}' 2026-03-08T23:30:08.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 3f95fae2-5dbe-4f2d-a42a-959f9e99343c -i td/osd-scrub-snaps/0/new.json 2026-03-08T23:30:08.245 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:30:08.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: 
rm td/osd-scrub-snaps/0/new.json 2026-03-08T23:30:08.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCABq5pPlenCBAA4LDBjd1dNu5iJmNp/ooKtQ== --osd-uuid 3f95fae2-5dbe-4f2d-a42a-959f9e99343c 2026-03-08T23:30:08.278 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:08.281+0000 7fc8f48038c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:30:08.285 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:08.289+0000 7fc8f48038c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:30:08.287 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:08.289+0000 7fc8f48038c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:30:08.287 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:08.289+0000 7fc8f48038c0 -1 bdev(0x561efccd2c00 td/osd-scrub-snaps/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:30:08.287 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:08.289+0000 7fc8f48038c0 -1 bluestore(td/osd-scrub-snaps/0) _read_fsid unparsable uuid 2026-03-08T23:30:10.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-snaps/0/keyring 2026-03-08T23:30:10.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:30:10.541 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:30:10.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:30:10.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-snaps/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:30:10.680 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:30:10.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:30:10.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 
--osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:30:10.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:30:10.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:30:10.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:30:10.707 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:10.705+0000 7f9d86e1f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:30:10.708 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:10.713+0000 7f9d86e1f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:30:10.710 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:10.713+0000 7f9d86e1f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:30:10.857 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:30:10.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:30:10.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:30:10.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:30:10.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:30:10.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:30:10.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:30:10.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:30:10.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:30:10.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:30:11.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:30:11.417 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:11.421+0000 7f9d86e1f8c0 -1 Falling back to public interface 2026-03-08T23:30:12.033 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:30:12.034 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:30:12.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:30:12.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:30:12.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:30:12.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:30:12.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:30:12.397 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:12.401+0000 7f9d86e1f8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:30:13.209 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:30:13.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:30:13.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:30:13.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:30:13.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:30:13.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:30:13.390 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:30:14.391 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:30:14.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:30:14.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:30:14.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:30:14.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:30:14.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1454033845,v1:127.0.0.1:6803/1454033845] [v2:127.0.0.1:6804/1454033845,v1:127.0.0.1:6805/1454033845] exists,up 3f95fae2-5dbe-4f2d-a42a-959f9e99343c 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:742: _scrub_snaps_multi: for osd in $(seq 0 $(expr 
$OSDS - 1)) 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:744: _scrub_snaps_multi: run_osd td/osd-scrub-snaps 1 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-snaps 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:30:14.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-snaps/1 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: 
ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/1' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/1/journal' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:30:14.554 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:30:14.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log' 2026-03-08T23:30:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid' 2026-03-08T23:30:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:30:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:30:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:30:14.555 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:30:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:30:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:30:14.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-snaps/1 2026-03-08T23:30:14.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:30:14.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=fd288a5c-4c8d-4565-8f83-c31f55b487cb 2026-03-08T23:30:14.558 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 fd288a5c-4c8d-4565-8f83-c31f55b487cb 2026-03-08T23:30:14.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 fd288a5c-4c8d-4565-8f83-c31f55b487cb' 2026-03-08T23:30:14.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:30:14.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCGBq5p/SKoIhAAyJtGtMEv4VXry+wPyLw10w== 2026-03-08T23:30:14.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCGBq5p/SKoIhAAyJtGtMEv4VXry+wPyLw10w=="}' 2026-03-08T23:30:14.574 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new fd288a5c-4c8d-4565-8f83-c31f55b487cb -i td/osd-scrub-snaps/1/new.json 2026-03-08T23:30:14.732 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:30:14.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-snaps/1/new.json 2026-03-08T23:30:14.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/1 --osd-journal=td/osd-scrub-snaps/1/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCGBq5p/SKoIhAAyJtGtMEv4VXry+wPyLw10w== --osd-uuid fd288a5c-4c8d-4565-8f83-c31f55b487cb 2026-03-08T23:30:14.766 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:14.769+0000 7fc440a7e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:30:14.768 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:14.773+0000 7fc440a7e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:30:14.769 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:14.773+0000 7fc440a7e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:30:14.769 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:14.773+0000 7fc440a7e8c0 -1 bdev(0x555dc073dc00 td/osd-scrub-snaps/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:30:14.769 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:14.773+0000 7fc440a7e8c0 -1 bluestore(td/osd-scrub-snaps/1) _read_fsid unparsable uuid 2026-03-08T23:30:17.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-snaps/1/keyring 2026-03-08T23:30:17.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:30:17.016 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:30:17.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:30:17.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-snaps/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:30:17.215 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:30:17.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:30:17.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 
--osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/1 --osd-journal=td/osd-scrub-snaps/1/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:30:17.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:30:17.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:30:17.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:30:17.235 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:17.237+0000 7f277283f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:30:17.240 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:17.245+0000 7f277283f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:30:17.242 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:17.245+0000 7f277283f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:30:17.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:30:17.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:30:17.684 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:17.689+0000 7f277283f8c0 -1 Falling back to public interface 2026-03-08T23:30:18.554 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:30:18.554 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:30:18.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:30:18.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:30:18.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:30:18.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:30:18.652 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:30:18.657+0000 7f277283f8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:30:18.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:30:19.739 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:30:19.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:30:19.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:30:19.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:30:19.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:30:19.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:30:19.911 
INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1888201638,v1:127.0.0.1:6811/1888201638] [v2:127.0.0.1:6812/1888201638,v1:127.0.0.1:6813/1888201638] exists,up fd288a5c-4c8d-4565-8f83-c31f55b487cb 2026-03-08T23:30:19.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:30:19.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:30:19.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:30:19.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:748: _scrub_snaps_multi: ceph osd set noscrub 2026-03-08T23:30:20.087 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:30:20.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:749: _scrub_snaps_multi: ceph osd set nodeep-scrub 2026-03-08T23:30:20.289 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:30:20.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:752: _scrub_snaps_multi: create_pool test 1 1 2026-03-08T23:30:20.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T23:30:20.494 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:30:20.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:30:21.509 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:753: _scrub_snaps_multi: wait_for_clean 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:30:21.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:30:21.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:30:21.565 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:30:21.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:30:21.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:30:21.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:30:21.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:30:21.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:30:21.728 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:30:21.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:30:21.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:30:21.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:30:21.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836483 2026-03-08T23:30:21.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836483 2026-03-08T23:30:21.809 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483' 2026-03-08T23:30:21.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:30:21.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:30:21.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672962 2026-03-08T23:30:21.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672962 2026-03-08T23:30:21.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836483 1-42949672962' 2026-03-08T23:30:21.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:30:21.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836483 2026-03-08T23:30:21.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:30:21.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:30:21.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836483 2026-03-08T23:30:21.892 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:30:21.893 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836483 2026-03-08T23:30:21.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836483 2026-03-08T23:30:21.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836483' 2026-03-08T23:30:21.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:30:22.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836482 -lt 21474836483 2026-03-08T23:30:22.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:30:23.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:30:23.052 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:30:23.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836483 -lt 21474836483 2026-03-08T23:30:23.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:30:23.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949672962 2026-03-08T23:30:23.213 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:30:23.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:30:23.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672962 2026-03-08T23:30:23.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:30:23.216 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672962 2026-03-08T23:30:23.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672962 2026-03-08T23:30:23.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672962' 2026-03-08T23:30:23.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:30:23.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672962 -lt 42949672962 2026-03-08T23:30:23.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:30:23.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:30:23.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: 
get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:30:23.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:30:23.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:30:23.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:30:23.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:30:23.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:30:23.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:30:23.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:30:23.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:30:23.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:30:23.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:30:23.739 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:30:23.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:30:23.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:30:23.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:30:23.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:30:23.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:754: _scrub_snaps_multi: ceph osd dump 2026-03-08T23:30:23.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:754: _scrub_snaps_multi: awk '{ print $2 }' 2026-03-08T23:30:23.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:754: _scrub_snaps_multi: grep '^pool.*['\'']test['\'']' 2026-03-08T23:30:24.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:754: _scrub_snaps_multi: poolid=1 2026-03-08T23:30:24.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:756: _scrub_snaps_multi: dd if=/dev/urandom of=testdata.420670 bs=1032 count=1 2026-03-08T23:30:24.087 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:30:24.087 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:30:24.087 
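wait_for_clean above counts PGs via get_num_active_clean, whose jq filter keeps states containing both "active" and "clean" while rejecting "stale". The same selection, approximated here with grep over one state string per line (the sample states are invented for illustration; the real helper filters `ceph pg dump pgs` JSON with jq):

```shell
#!/bin/sh
# Approximation of get_num_active_clean's jq filter (ceph-helpers.sh:1364-1368):
#   select(contains("active") and contains("clean")) | select(contains("stale") | not)
# The PG states below are invented samples, not taken from the trace.
states="active+clean
active+clean+scrubbing
stale+active+clean
active+recovering"
# Keep lines with both "active" and "clean", then count those without "stale".
printf '%s\n' "$states" | grep active | grep clean | grep -cv stale
```

In the trace the cluster has a single PG, so the helper returned 1 and wait_for_clean broke out of its loop.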
INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 6.38e-05 s, 16.2 MB/s 2026-03-08T23:30:24.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: seq 1 16 2026-03-08T23:30:24.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj1 testdata.420670 2026-03-08T23:30:24.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj2 testdata.420670 2026-03-08T23:30:24.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj3 testdata.420670 2026-03-08T23:30:24.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj4 testdata.420670 2026-03-08T23:30:24.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: 
_scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj5 testdata.420670 2026-03-08T23:30:24.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj6 testdata.420670 2026-03-08T23:30:24.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj7 testdata.420670 2026-03-08T23:30:24.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj8 testdata.420670 2026-03-08T23:30:24.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj9 testdata.420670 2026-03-08T23:30:24.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in 
`seq 1 $OBJS` 2026-03-08T23:30:24.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj10 testdata.420670 2026-03-08T23:30:24.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj11 testdata.420670 2026-03-08T23:30:24.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj12 testdata.420670 2026-03-08T23:30:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj13 testdata.420670 2026-03-08T23:30:24.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj14 testdata.420670 2026-03-08T23:30:24.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 
2026-03-08T23:30:24.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj15 testdata.420670 2026-03-08T23:30:24.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:757: _scrub_snaps_multi: for i in `seq 1 $OBJS` 2026-03-08T23:30:24.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:759: _scrub_snaps_multi: rados -p test put obj16 testdata.420670 2026-03-08T23:30:24.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:762: _scrub_snaps_multi: get_primary test obj1 2026-03-08T23:30:24.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:30:24.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:30:24.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:30:24.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:30:24.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:762: _scrub_snaps_multi: local primary=1 2026-03-08T23:30:24.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:763: _scrub_snaps_multi: get_not_primary test obj1 2026-03-08T23:30:24.622 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=test 2026-03-08T23:30:24.622 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=obj1 2026-03-08T23:30:24.622 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary test obj1 2026-03-08T23:30:24.622 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:30:24.622 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:30:24.622 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:30:24.622 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:30:24.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T23:30:24.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map test obj1 2026-03-08T23:30:24.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. 
!= 1)) | .[0]' 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:763: _scrub_snaps_multi: local replica=0 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:765: _scrub_snaps_multi: eval create_scenario td/osd-scrub-snaps test testdata.420670 '$replica' 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:765: _scrub_snaps_multi: create_scenario td/osd-scrub-snaps test testdata.420670 0 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:45: create_scenario: local dir=td/osd-scrub-snaps 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:46: create_scenario: local poolname=test 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:47: create_scenario: local TESTDATA=testdata.420670 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:48: create_scenario: local osd=0 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:50: create_scenario: SNAP=1 2026-03-08T23:30:24.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:51: create_scenario: rados -p test mksnap snap1 2026-03-08T23:30:25.023 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap1 2026-03-08T23:30:25.025 
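The primary/replica selection just traced (get_primary, then get_not_primary's jq `.acting | map(select(. != 1)) | .[0]`) picks the first acting OSD that is not the acting primary. A plain-shell sketch of that selection, with the acting set hard-coded to the values visible in the trace (primary=1, replica=0):

```shell
#!/bin/sh
# Sketch of get_not_primary's selection (ceph-helpers.sh:1232-1234). The real
# helper reads the acting set from "ceph --format json osd map <pool> <obj>"
# via jq; the values here are hard-coded to match the trace.
primary=1
acting="1 0"
replica=""
for osd in $acting; do
    if [ "$osd" -ne "$primary" ]; then
        replica=$osd            # first acting member that is not the primary
        break
    fi
done
echo "replica=$replica"
```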
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:52: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=1 2026-03-08T23:30:25.026 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:30:25.026 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:30:25.026 INFO:tasks.workunit.client.0.vm03.stderr:256 bytes copied, 8.5119e-05 s, 3.0 MB/s 2026-03-08T23:30:25.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:53: create_scenario: rados -p test put obj1 testdata.420670 2026-03-08T23:30:25.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:54: create_scenario: rados -p test put obj5 testdata.420670 2026-03-08T23:30:25.072 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:55: create_scenario: rados -p test put obj3 testdata.420670 2026-03-08T23:30:25.097 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: seq 6 14 2026-03-08T23:30:25.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj6 testdata.420670 2026-03-08T23:30:25.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.121 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj7 
testdata.420670 2026-03-08T23:30:25.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj8 testdata.420670 2026-03-08T23:30:25.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj9 testdata.420670 2026-03-08T23:30:25.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj10 testdata.420670 2026-03-08T23:30:25.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj11 testdata.420670 2026-03-08T23:30:25.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj12 testdata.420670 2026-03-08T23:30:25.266 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.266 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj13 testdata.420670 2026-03-08T23:30:25.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:56: create_scenario: for i in `seq 6 14` 2026-03-08T23:30:25.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:57: create_scenario: rados -p test put obj14 testdata.420670 2026-03-08T23:30:25.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:60: create_scenario: SNAP=2 2026-03-08T23:30:25.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:61: create_scenario: rados -p test mksnap snap2 2026-03-08T23:30:25.386 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap2 2026-03-08T23:30:25.388 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:62: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=2 2026-03-08T23:30:25.389 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records in 2026-03-08T23:30:25.389 INFO:tasks.workunit.client.0.vm03.stderr:2+0 records out 2026-03-08T23:30:25.389 INFO:tasks.workunit.client.0.vm03.stderr:512 bytes copied, 7.3768e-05 s, 6.9 MB/s 2026-03-08T23:30:25.389 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:63: create_scenario: rados -p test put obj5 testdata.420670 2026-03-08T23:30:25.412 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:65: create_scenario: SNAP=3 2026-03-08T23:30:25.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:66: create_scenario: rados -p test mksnap snap3 2026-03-08T23:30:25.488 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap3 2026-03-08T23:30:25.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:67: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=3 2026-03-08T23:30:25.491 INFO:tasks.workunit.client.0.vm03.stderr:3+0 records in 2026-03-08T23:30:25.491 INFO:tasks.workunit.client.0.vm03.stderr:3+0 records out 2026-03-08T23:30:25.491 INFO:tasks.workunit.client.0.vm03.stderr:768 bytes copied, 8.7544e-05 s, 8.8 MB/s 2026-03-08T23:30:25.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:68: create_scenario: rados -p test put obj3 testdata.420670 2026-03-08T23:30:25.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:70: create_scenario: SNAP=4 2026-03-08T23:30:25.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:71: create_scenario: rados -p test mksnap snap4 2026-03-08T23:30:25.592 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap4 2026-03-08T23:30:25.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:72: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=4 2026-03-08T23:30:25.595 INFO:tasks.workunit.client.0.vm03.stderr:4+0 records in 2026-03-08T23:30:25.595 INFO:tasks.workunit.client.0.vm03.stderr:4+0 records out 2026-03-08T23:30:25.595 
INFO:tasks.workunit.client.0.vm03.stderr:1024 bytes (1.0 kB, 1.0 KiB) copied, 8.568e-05 s, 12.0 MB/s 2026-03-08T23:30:25.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:73: create_scenario: rados -p test put obj5 testdata.420670 2026-03-08T23:30:25.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:74: create_scenario: rados -p test put obj2 testdata.420670 2026-03-08T23:30:25.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:76: create_scenario: SNAP=5 2026-03-08T23:30:25.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:77: create_scenario: rados -p test mksnap snap5 2026-03-08T23:30:25.696 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap5 2026-03-08T23:30:25.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:78: create_scenario: SNAP=6 2026-03-08T23:30:25.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:79: create_scenario: rados -p test mksnap snap6 2026-03-08T23:30:25.801 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap6 2026-03-08T23:30:25.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:80: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=6 2026-03-08T23:30:25.806 INFO:tasks.workunit.client.0.vm03.stderr:6+0 records in 2026-03-08T23:30:25.806 INFO:tasks.workunit.client.0.vm03.stderr:6+0 records out 2026-03-08T23:30:25.806 INFO:tasks.workunit.client.0.vm03.stderr:1536 bytes (1.5 kB, 1.5 KiB) copied, 9.634e-05 s, 15.9 MB/s 2026-03-08T23:30:25.806 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:81: create_scenario: rados -p test put obj5 testdata.420670 2026-03-08T23:30:25.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:83: create_scenario: SNAP=7 2026-03-08T23:30:25.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:84: create_scenario: rados -p test mksnap snap7 2026-03-08T23:30:25.905 INFO:tasks.workunit.client.0.vm03.stdout:created pool test snap snap7 2026-03-08T23:30:25.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:86: create_scenario: rados -p test rm obj4 2026-03-08T23:30:25.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:87: create_scenario: rados -p test rm obj16 2026-03-08T23:30:25.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:88: create_scenario: rados -p test rm obj2 2026-03-08T23:30:25.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:90: create_scenario: kill_daemons td/osd-scrub-snaps TERM osd 2026-03-08T23:30:25.983 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:30:25.983 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:30:25.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:30:25.984 
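The create_scenario steps above repeat one pattern per snapshot: mksnap, regenerate testdata at a new size (256 bytes times the snap number), then overwrite chosen objects so each snapshot pins a distinct clone. A condensed sketch of that pattern, simplified to a single object and with rados stubbed out so it runs without a cluster:

```shell
#!/bin/sh
# Condensed sketch of the snapshot/overwrite loop in create_scenario
# (osd-scrub-snaps.sh:50-81). rados is stubbed with echo here; the real test
# invokes the actual rados binary and overwrites several objects per snap.
rados() { echo "rados $*"; }
TESTDATA="testdata.$$"
for SNAP in 1 2 3 4; do
    rados -p test mksnap "snap$SNAP"
    # Each iteration writes SNAP*256 bytes, mirroring the dd calls in the trace.
    dd if=/dev/urandom of="$TESTDATA" bs=256 count="$SNAP" 2>/dev/null
    rados -p test put obj5 "$TESTDATA"
done
rm -f "$TESTDATA"
```

After the overwrites, the trace removes obj4, obj16 and obj2 and stops the OSDs before corrupting the on-disk state with ceph-objectstore-tool.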
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:30:25.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:30:26.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:30:26.089 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:94: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj1 2026-03-08T23:30:26.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:94: create_scenario: JSON='["1.0",{"oid":"obj1","key":"","snapid":-2,"hash":1828249343,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:30:26.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:95: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj1","key":"","snapid":-2,"hash":1828249343,"max":0,"pool":1,"namespace":"","max":0}]' --force remove 2026-03-08T23:30:27.358 INFO:tasks.workunit.client.0.vm03.stdout:WARNING: only removing head with clones present 2026-03-08T23:30:27.358 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:ff7b1f36:::obj1:head# 2026-03-08T23:30:27.891 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: grep '"snapid":2' 2026-03-08T23:30:27.891 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj5 2026-03-08T23:30:28.739 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:97: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":2,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:30:28.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:98: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj5","key":"","snapid":2,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' remove 2026-03-08T23:30:29.377 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:c52c9666:::obj5:2# 2026-03-08T23:30:29.911 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj5 2026-03-08T23:30:29.912 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: grep '"snapid":1' 2026-03-08T23:30:30.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:100: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:30:30.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:101: create_scenario: OBJ5SAVE='["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:30:30.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:103: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/0 list 2026-03-08T23:30:31.847 
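The head/clone removals above drive ceph-objectstore-tool by first listing obj5's entries and grep-selecting one clone by snapid, then feeding the matching JSON back as the object spec for `remove`. The selection step in isolation (the two-line listing is copied from the trace; in the real test it comes from `ceph-objectstore-tool --op list obj5`):

```shell
#!/bin/sh
# Sketch of the clone selection at osd-scrub-snaps.sh:97/100. grep picks the
# snapid:2 clone; the selected JSON line then becomes the object spec passed
# to "ceph-objectstore-tool --data-path ... <JSON> remove".
list='["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]
["1.0",{"oid":"obj5","key":"","snapid":2,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
JSON=$(printf '%s\n' "$list" | grep '"snapid":2')
echo "$JSON"
```

The same pattern is used twice in the trace: once to remove the snapid:2 clone, once to save the snapid:1 spec in OBJ5SAVE for later use.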
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:104: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.0779578A.7.obj4.. 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3.. 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16.. 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5.. 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6.. 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13.. 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8.. 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12.. 2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14.. 
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:30:31.847 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:105: create_scenario: grep '^[pm].*SNA_.*[.]1[.]obj5[.][.]$' td/osd-scrub-snaps/drk.log
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:30:31.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:106: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --rmtype nosnapmap '["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' remove
2026-03-08T23:30:32.483 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:c52c9666:::obj5:1#
2026-03-08T23:30:33.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:108: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/0 list
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:109: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14..
2026-03-08T23:30:34.095 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.0779578A.7.obj4..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:110: create_scenario: grep '^[pm].*SNA_.*[.]1[.]obj5[.][.]$' td/osd-scrub-snaps/drk.log
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:30:34.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:111: create_scenario: rm -f td/osd-scrub-snaps/drk.log
2026-03-08T23:30:34.097 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj5
2026-03-08T23:30:34.097 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: grep '"snapid":4'
2026-03-08T23:30:34.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:113: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":4,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:34.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:114: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=18
2026-03-08T23:30:34.943 INFO:tasks.workunit.client.0.vm03.stderr:18+0 records in
2026-03-08T23:30:34.943 INFO:tasks.workunit.client.0.vm03.stderr:18+0 records out
2026-03-08T23:30:34.943 INFO:tasks.workunit.client.0.vm03.stderr:4608 bytes (4.6 kB, 4.5 KiB) copied, 8.9327e-05 s, 51.6 MB/s
2026-03-08T23:30:34.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:115: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj5","key":"","snapid":4,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670
2026-03-08T23:30:36.111 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:117: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj3
2026-03-08T23:30:36.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:117: create_scenario: JSON='["1.0",{"oid":"obj3","key":"","snapid":-2,"hash":1643547569,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:36.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:118: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=15
2026-03-08T23:30:36.952 INFO:tasks.workunit.client.0.vm03.stderr:15+0 records in
2026-03-08T23:30:36.952 INFO:tasks.workunit.client.0.vm03.stderr:15+0 records out
2026-03-08T23:30:36.952 INFO:tasks.workunit.client.0.vm03.stderr:3840 bytes (3.8 kB, 3.8 KiB) copied, 9.7943e-05 s, 39.2 MB/s
2026-03-08T23:30:36.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:119: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj3","key":"","snapid":-2,"hash":1643547569,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670
2026-03-08T23:30:38.120 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj4
2026-03-08T23:30:38.120 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: grep '"snapid":7'
2026-03-08T23:30:38.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:121: create_scenario: JSON='["1.0",{"oid":"obj4","key":"","snapid":7,"hash":2826278768,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:38.967 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:122: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj4","key":"","snapid":7,"hash":2826278768,"max":0,"pool":1,"namespace":"","max":0}]' remove
2026-03-08T23:30:39.597 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:0ee9ae15:::obj4:7#
2026-03-08T23:30:40.131 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:125: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/0 list
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:126: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:41.215 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:127: create_scenario: grep '^[pm].*SNA_.*[.]7[.]obj16[.][.]$' td/osd-scrub-snaps/drk.log
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.268F1DA7.7.obj16..
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --op list obj16
2026-03-08T23:30:41.216 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: grep '"snapid":7'
2026-03-08T23:30:42.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:128: create_scenario: JSON='["1.0",{"oid":"obj16","key":"","snapid":7,"hash":2060580962,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:42.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:129: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --rmtype snapmap '["1.0",{"oid":"obj16","key":"","snapid":7,"hash":2060580962,"max":0,"pool":1,"namespace":"","max":0}]' remove
2026-03-08T23:30:42.709 INFO:tasks.workunit.client.0.vm03.stdout:remove #1:461f8b5e:::obj16:7#
2026-03-08T23:30:43.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:131: create_scenario: ceph-kvstore-tool bluestore-kv td/osd-scrub-snaps/0 list
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:132: create_scenario: grep SNA_ td/osd-scrub-snaps/drk.log
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.1BB86F16.1.obj3..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.3A439666.1.obj5..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.4CC52438.1.obj6..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5685B6C7.1.obj13..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5AAE7FD8.1.obj8..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.5FAF9A3D.1.obj12..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.70522149.1.obj14..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.77574592.1.obj11..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.950988C5.1.obj7..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.B8CCECA2.1.obj10..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.F7CA874E.1.obj9..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000001_0000000000000001.FFED8FC6.1.obj1..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000002_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.1BB86F16.3.obj3..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.3A439666.4.obj5..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000003_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.3A439666.4.obj5..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000004_0000000000000001.802EE1F3.4.obj2..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.3A439666.6.obj5..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000005_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.3A439666.6.obj5..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000006_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:44.323 INFO:tasks.workunit.client.0.vm03.stdout:p %00%00%00%00%00%00%00%00%c07%16%25%00%00%00%00%00%00%04%02.SNA_1_0000000000000007_0000000000000001.802EE1F3.7.obj2..
2026-03-08T23:30:44.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:133: create_scenario: grep '^[pm].*SNA_.*[.]7[.]obj16[.][.]$' td/osd-scrub-snaps/drk.log
2026-03-08T23:30:44.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:134: create_scenario: rm -f td/osd-scrub-snaps/drk.log
2026-03-08T23:30:44.325 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:136: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj2
2026-03-08T23:30:45.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:136: create_scenario: JSON='["1.0",{"oid":"obj2","key":"","snapid":-2,"hash":1058988552,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:45.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:137: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj2","key":"","snapid":-2,"hash":1058988552,"max":0,"pool":1,"namespace":"","max":0}]' rm-attr snapset
2026-03-08T23:30:46.799 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: echo '["1.0",{"oid":"obj5","key":"","snapid":1,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:46.799 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: sed 's/snapid":1/snapid":7/'
2026-03-08T23:30:46.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:140: create_scenario: JSON='["1.0",{"oid":"obj5","key":"","snapid":7,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:46.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:141: create_scenario: dd if=/dev/urandom of=testdata.420670 bs=256 count=7
2026-03-08T23:30:46.801 INFO:tasks.workunit.client.0.vm03.stderr:7+0 records in
2026-03-08T23:30:46.801 INFO:tasks.workunit.client.0.vm03.stderr:7+0 records out
2026-03-08T23:30:46.801 INFO:tasks.workunit.client.0.vm03.stderr:1792 bytes (1.8 kB, 1.8 KiB) copied, 7.6453e-05 s, 23.4 MB/s
2026-03-08T23:30:46.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:142: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj5","key":"","snapid":7,"hash":1718170787,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes testdata.420670
2026-03-08T23:30:47.971 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:144: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj6
2026-03-08T23:30:48.283 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:30:48.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:144: create_scenario: JSON='["1.0",{"oid":"obj6","key":"","snapid":-2,"hash":2202164420,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:48.811 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:145: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj6","key":"","snapid":-2,"hash":2202164420,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset
2026-03-08T23:30:49.979 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:146: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj7
2026-03-08T23:30:50.297 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:30:50.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:146: create_scenario: JSON='["1.0",{"oid":"obj7","key":"","snapid":-2,"hash":1552453721,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:50.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:147: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj7","key":"","snapid":-2,"hash":1552453721,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset corrupt
2026-03-08T23:30:51.996 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:148: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj8
2026-03-08T23:30:52.307 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:30:52.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:148: create_scenario: JSON='["1.0",{"oid":"obj8","key":"","snapid":-2,"hash":2381834917,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:52.848 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:149: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj8","key":"","snapid":-2,"hash":2381834917,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset seq
2026-03-08T23:30:54.019 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:150: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj9
2026-03-08T23:30:54.328 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:30:54.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:150: create_scenario: JSON='["1.0",{"oid":"obj9","key":"","snapid":-2,"hash":3833113727,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:54.863 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:151: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj9","key":"","snapid":-2,"hash":3833113727,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset clone_size
2026-03-08T23:30:56.160 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:152: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj10
2026-03-08T23:30:56.479 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:30:57.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:152: create_scenario: JSON='["1.0",{"oid":"obj10","key":"","snapid":-2,"hash":718195851,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:57.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:153: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj10","key":"","snapid":-2,"hash":718195851,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset clone_overlap
2026-03-08T23:30:58.183 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:154: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj11
2026-03-08T23:30:58.493 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:30:59.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:154: create_scenario: JSON='["1.0",{"oid":"obj11","key":"","snapid":-2,"hash":693400951,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:30:59.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:155: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj11","key":"","snapid":-2,"hash":693400951,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset clones
2026-03-08T23:31:00.199 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:156: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj12
2026-03-08T23:31:00.517 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:31:01.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:156: create_scenario: JSON='["1.0",{"oid":"obj12","key":"","snapid":-2,"hash":3551132405,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:31:01.050 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:157: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj12","key":"","snapid":-2,"hash":3551132405,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset head
2026-03-08T23:31:02.219 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:158: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj13
2026-03-08T23:31:02.527 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:31:03.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:158: create_scenario: JSON='["1.0",{"oid":"obj13","key":"","snapid":-2,"hash":2087409765,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:31:03.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:159: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj13","key":"","snapid":-2,"hash":2087409765,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset snaps
2026-03-08T23:31:04.227 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:160: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj14
2026-03-08T23:31:04.541 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:31:05.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:160: create_scenario: JSON='["1.0",{"oid":"obj14","key":"","snapid":-2,"hash":2484217095,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:31:05.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:161: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj14","key":"","snapid":-2,"hash":2484217095,"max":0,"pool":1,"namespace":"","max":0}]' clear-snapset size
2026-03-08T23:31:06.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:163: create_scenario: echo garbage
2026-03-08T23:31:06.248 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:164: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 --head --op list obj15
2026-03-08T23:31:06.564 INFO:tasks.workunit.client.0.vm03.stderr:Error getting attr on : 1.0_head,#1:c52c9666:::obj5:7#, (61) No data available
2026-03-08T23:31:07.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:164: create_scenario: JSON='["1.0",{"oid":"obj15","key":"","snapid":-2,"hash":612772309,"max":0,"pool":1,"namespace":"","max":0}]'
2026-03-08T23:31:07.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:165: create_scenario: ceph-objectstore-tool --data-path td/osd-scrub-snaps/0 '["1.0",{"oid":"obj15","key":"","snapid":-2,"hash":612772309,"max":0,"pool":1,"namespace":"","max":0}]' set-attr snapset td/osd-scrub-snaps/bad
2026-03-08T23:31:08.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:166: create_scenario: rm -f td/osd-scrub-snaps/bad
2026-03-08T23:31:08.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:167: create_scenario: return 0
2026-03-08T23:31:08.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:767: _scrub_snaps_multi: rm -f testdata.420670
2026-03-08T23:31:08.257 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:769: _scrub_snaps_multi: expr 2 - 1
2026-03-08T23:31:08.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:769: _scrub_snaps_multi: seq 0 1
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:769: _scrub_snaps_multi: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:771: _scrub_snaps_multi: activate_osd td/osd-scrub-snaps 0
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-snaps
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=0
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-snaps/0
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1'
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/0'
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/0/journal'
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps'
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:31:08.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:31:08.260 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:31:08.260 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:31:08.260 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:31:08.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:31:08.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-snaps/0
2026-03-08T23:31:08.262 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:31:08.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.0
2026-03-08T23:31:08.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 0 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/0 --osd-journal=td/osd-scrub-snaps/0/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:31:08.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-snaps/0/whoami
2026-03-08T23:31:08.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 0 = 0 ']'
2026-03-08T23:31:08.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:31:08.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:31:08.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:31:08.285 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:08.285+0000 7f7a408348c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:31:08.296 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:08.301+0000 7f7a408348c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:31:08.307 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:08.301+0000 7f7a408348c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:31:08.453 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:31:08.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 0
2026-03-08T23:31:08.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:31:08.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:31:08.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:31:08.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:31:08.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:31:08.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:31:08.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:31:08.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:31:08.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:31:09.253 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:09.257+0000 7f7a408348c0 -1 Falling back to public interface
2026-03-08T23:31:09.632 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:31:09.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:31:09.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:31:09.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:31:09.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:31:09.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:31:09.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:31:10.226 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:10.229+0000 7f7a408348c0 -1 osd.0 23 log_to_monitors true
2026-03-08T23:31:10.810 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:31:10.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:31:10.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:31:10.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:31:10.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:31:10.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:31:11.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:31:12.007 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:31:12.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:31:12.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:31:12.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:31:12.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:31:12.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 26 up_thru 26 down_at 24 last_clean_interval [5,23) [v2:127.0.0.1:6802/2123180490,v1:127.0.0.1:6803/2123180490] [v2:127.0.0.1:6804/2123180490,v1:127.0.0.1:6805/2123180490] exists,up 3f95fae2-5dbe-4f2d-a42a-959f9e99343c
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:769: _scrub_snaps_multi: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:771: _scrub_snaps_multi: activate_osd td/osd-scrub-snaps 1
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-snaps
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-snaps/1
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-snaps/1'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-snaps/1/journal'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-snaps'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670
2026-03-08T23:31:12.184 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-snaps/$name.log'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-snaps/$name.pid'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:31:12.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-snaps/1 2026-03-08T23:31:12.186 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:31:12.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:31:12.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 --osd_scrub_chunk_min=3 --osd_scrub_chunk_max=20 --osd_shallow_scrub_chunk_min=3 --osd_shallow_scrub_chunk_max=3 --osd_pg_stat_report_interval_max_seconds=1 --osd_pg_stat_report_interval_max_epochs=1 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-snaps/1 --osd-journal=td/osd-scrub-snaps/1/journal --chdir= --run-dir=td/osd-scrub-snaps '--admin-socket=/tmp/ceph-asok.420670/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-snaps/$name.log' '--pid-file=td/osd-scrub-snaps/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:31:12.186 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-snaps/1/whoami 2026-03-08T23:31:12.187 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:31:12.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:31:12.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:31:12.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:31:12.206 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:12.209+0000 7f37e7c988c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:12.212 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:12.217+0000 7f37e7c988c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:12.213 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:12.217+0000 7f37e7c988c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:12.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:12.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:13.425 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:13.429+0000 7f37e7c988c0 -1 Falling back to public interface 2026-03-08T23:31:13.546 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:31:13.547 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:13.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:13.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:31:13.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:13.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:13.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:14.684 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:14.689+0000 7f37e7c988c0 -1 osd.1 23 log_to_monitors true 2026-03-08T23:31:14.725 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:31:14.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:14.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:14.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:31:14.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:14.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:15.068 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:16.069 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:31:16.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:16.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:16.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:31:16.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:16.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:16.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:17.239 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:31:17.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:17.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:17.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:31:17.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:17.240 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:17.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:17.567 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:17.569+0000 7f37dec48640 -1 osd.1 23 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:31:18.424 INFO:tasks.workunit.client.0.vm03.stdout:5 2026-03-08T23:31:18.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:18.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:18.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 5 2026-03-08T23:31:18.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:18.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:18.586 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 29 up_thru 29 down_at 24 last_clean_interval [10,23) [v2:127.0.0.1:6810/735565453,v1:127.0.0.1:6811/735565453] [v2:127.0.0.1:6812/735565453,v1:127.0.0.1:6813/735565453] exists,up fd288a5c-4c8d-4565-8f83-c31f55b487cb 2026-03-08T23:31:18.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:31:18.586 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:31:18.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:31:18.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:774: _scrub_snaps_multi: ceph tell 'osd.*' config set osd_shallow_scrub_chunk_max 3 2026-03-08T23:31:18.659 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:31:18.659 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_shallow_scrub_chunk_max = '' (not observed, change may require restart) osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T23:31:18.659 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:18.666 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:31:18.667 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = 
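The `wait_for_osd` trace above is a bounded polling loop: grep `ceph osd dump` for `osd.<id> <state>` once per second, give up after 300 attempts. A minimal Python sketch of that logic (the injectable `dump`/`sleep` callables are illustrative stand-ins for shelling out to `ceph osd dump`, not part of ceph-helpers.sh):

```python
def wait_for_osd(state, osd_id, dump, tries=300, sleep=lambda s: None):
    """Poll dump() until "osd.<id> <state>" appears, up to `tries` times.

    dump() is assumed to return the text of `ceph osd dump`; sleep() is
    injectable so the loop can be exercised without real delays.
    """
    needle = f"osd.{osd_id} {state}"
    for _ in range(tries):
        if needle in dump():
            return True   # matches the `status=0; break` path in the trace
        sleep(1)          # matches the `sleep 1` between attempts
    return False
```

In the log, the needle `osd.1 up` only matches on the sixth attempt, after the OSD has finished booting and the mon has marked it up.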
'' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_sleep = '' osd_shallow_scrub_chunk_max = '' (not observed, change may require restart) osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T23:31:18.667 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:18.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:775: _scrub_snaps_multi: ceph tell 'osd.*' config set osd_shallow_scrub_chunk_min 3 2026-03-08T23:31:18.759 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:31:18.759 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_shallow_scrub_chunk_min = '' (not observed, change may require restart) " 2026-03-08T23:31:18.759 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:18.766 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:31:18.766 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_shallow_scrub_chunk_min = '' (not observed, change may require restart) " 2026-03-08T23:31:18.766 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:18.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:776: _scrub_snaps_multi: ceph tell 'osd.*' config set osd_scrub_chunk_min 3 2026-03-08T23:31:18.844 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:31:18.844 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_chunk_min = '' (not observed, change may require restart) " 2026-03-08T23:31:18.844 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:18.852 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:31:18.852 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_chunk_min = '' (not observed, change may require restart) " 2026-03-08T23:31:18.852 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:18.861 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:777: _scrub_snaps_multi: ceph tell 'osd.*' config set osd_pg_stat_report_interval_max_seconds 1 2026-03-08T23:31:18.928 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:31:18.928 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_seconds = '' (not observed, change may require restart) " 2026-03-08T23:31:18.928 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:18.935 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:31:18.935 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_seconds = '' (not observed, change may require restart) " 2026-03-08T23:31:18.935 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:18.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:778: _scrub_snaps_multi: ceph tell 'osd.*' config set osd_pg_stat_report_interval_max_epochs 1 2026-03-08T23:31:19.017 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:31:19.017 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_epochs = '' (not observed, change may require restart) " 2026-03-08T23:31:19.017 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:19.023 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:31:19.023 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_pg_stat_report_interval_max_epochs = '' (not observed, change may require restart) " 2026-03-08T23:31:19.023 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:779: _scrub_snaps_multi: wait_for_clean 2026-03-08T23:31:19.034 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:31:19.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:31:19.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:31:19.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:31:19.097 
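The `delays` array that `get_timeout_delays 90 .1` expands to above (`0.1 0.2 0.4 ... 12.8 15 15 15 15 4.5`) is a doubling backoff capped at a maximum step, with a final remainder entry so the series sums exactly to the requested timeout. A sketch reproducing that schedule (a hypothetical reimplementation inferred from the traced output; the real helper does this in bash arithmetic):

```python
def get_timeout_delays(timeout, first_step=1.0, max_step=15.0):
    """Emit a doubling backoff series, clamped at max_step, whose entries
    sum to `timeout` (the last entry is the remainder, if any)."""
    delays = []
    total = 0
    step = first_step
    while total + step <= timeout:
        delays.append(step)
        total += step
        step = min(step * 2, max_step)  # double, but never exceed the cap
    if timeout - total > 1e-9:
        delays.append(round(timeout - total, 10))  # remainder to hit timeout
    return delays
```

The second invocation in the log, `get_timeout_delays 90 1 3`, fits the same pattern: `1 2` then twenty-nine `3`s, again summing to 90.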
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:31:19.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:31:19.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:31:19.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:31:19.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:31:19.263 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:31:19.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:31:19.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:31:19.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:31:19.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149699 2026-03-08T23:31:19.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149699 2026-03-08T23:31:19.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149699' 2026-03-08T23:31:19.348 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:31:19.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:31:19.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051586 2026-03-08T23:31:19.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051586 2026-03-08T23:31:19.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149699 1-124554051586' 2026-03-08T23:31:19.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:31:19.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-111669149699 2026-03-08T23:31:19.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:31:19.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:31:19.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-111669149699 2026-03-08T23:31:19.434 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:31:19.435 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 111669149699 
2026-03-08T23:31:19.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149699 2026-03-08T23:31:19.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 111669149699' 2026-03-08T23:31:19.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:31:19.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149698 -lt 111669149699 2026-03-08T23:31:19.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:31:20.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:31:20.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:31:20.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149700 -lt 111669149699 2026-03-08T23:31:20.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:31:20.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-124554051586 2026-03-08T23:31:20.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:31:20.788 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:31:20.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-124554051586 2026-03-08T23:31:20.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:31:20.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051586 2026-03-08T23:31:20.789 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 124554051586 2026-03-08T23:31:20.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 124554051586' 2026-03-08T23:31:20.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:31:20.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051587 -lt 124554051586 2026-03-08T23:31:20.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:31:20.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:31:20.972 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:31:21.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 
2026-03-08T23:31:21.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:31:21.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:31:21.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:31:21.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:31:21.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:31:21.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:31:21.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:31:21.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:31:21.369 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:31:21.369 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:31:21.369 
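The jq expression traced in `get_num_active_clean` counts PGs whose state string contains both `active` and `clean` but not `stale`; `wait_for_clean` then compares that count against the total PG count. The same filter, sketched in Python over the `.pg_stats` array that `ceph --format json pg dump pgs` returns (field names taken from the jq filter in the trace):

```python
def num_active_clean(pg_stats):
    """Count PGs in an active+clean, non-stale state, mirroring the jq
    filter: select(contains("active") and contains("clean")) |
            select(contains("stale") | not)."""
    return sum(
        1
        for pg in pg_stats
        if "active" in pg["state"]
        and "clean" in pg["state"]
        and "stale" not in pg["state"]
    )
```

Here the cluster has a single PG, so `wait_for_clean` exits as soon as the count reaches `get_num_pgs`, i.e. `test 1 = 1`.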
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:781: _scrub_snaps_multi: local pgid=1.0 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:782: _scrub_snaps_multi: pg_scrub 1.0 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=1.0 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 1.0 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=1.0 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:31:21.579 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:31:21.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:31:21.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:31:21.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:31:21.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:31:21.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:31:21.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:31:21.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:31:21.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:31:21.918 
INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:31:21.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:31:21.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:31:21.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:31:22.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=111669149701 2026-03-08T23:31:22.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 111669149701 2026-03-08T23:31:22.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149701' 2026-03-08T23:31:22.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:31:22.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:31:22.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=124554051588 2026-03-08T23:31:22.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 124554051588 2026-03-08T23:31:22.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-111669149701 1-124554051588' 
2026-03-08T23:31:22.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:31:22.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-111669149701 2026-03-08T23:31:22.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:31:22.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:31:22.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-111669149701 2026-03-08T23:31:22.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:31:22.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=111669149701 2026-03-08T23:31:22.105 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 111669149701 2026-03-08T23:31:22.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 111669149701' 2026-03-08T23:31:22.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:31:22.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149700 -lt 111669149701 2026-03-08T23:31:22.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: 
sleep 1 2026-03-08T23:31:23.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:31:23.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:31:23.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 111669149701 -lt 111669149701 2026-03-08T23:31:23.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:31:23.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-124554051588 2026-03-08T23:31:23.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:31:23.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:31:23.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-124554051588 2026-03-08T23:31:23.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:31:23.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=124554051588 2026-03-08T23:31:23.470 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 124554051588 2026-03-08T23:31:23.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.1 seq 124554051588' 2026-03-08T23:31:23.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 124554051588 -lt 124554051588 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 1.0 loop 0 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 1.0 loop 0' 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 1.0 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=1.0 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 1.0 query 2026-03-08T23:31:23.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:31:23.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:31:23.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: 
is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:31:23.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:31:23.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:31:23.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 1.0 2026-03-08T23:31:23.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:31:23.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:31:23.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:31:23.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:31:23.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local last_scrub=2026-03-08T23:30:20.499761+0000 2026-03-08T23:31:23.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 1.0 2026-03-08T23:31:24.074 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to scrub 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 1.0 
2026-03-08T23:30:20.499761+0000 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:30:20.499761+0000 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:31:24.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:31:24.262 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:30:20.499761+0000 '>' 2026-03-08T23:30:20.499761+0000 2026-03-08T23:31:24.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:31:25.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:31:25.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:31:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:31:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:31:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:31:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:31:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:31:25.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:30:20.499761+0000 '>' 2026-03-08T23:30:20.499761+0000 2026-03-08T23:31:25.443 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:31:26.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:31:26.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 300 )) 2026-03-08T23:31:26.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:31:26.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:31:26.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:31:26.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:31:26.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:31:26.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:31:24.875268+0000 '>' 2026-03-08T23:30:20.499761+0000 2026-03-08T23:31:26.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:31:26.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:786: 
_scrub_snaps_multi: grep '_scan_snaps start' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:26.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:786: _scrub_snaps_multi: wc -l 2026-03-08T23:31:26.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:786: _scrub_snaps_multi: test 16 -gt 3 2026-03-08T23:31:26.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:787: _scrub_snaps_multi: grep '_scan_snaps start' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:31:26.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:787: _scrub_snaps_multi: wc -l 2026-03-08T23:31:26.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:787: _scrub_snaps_multi: test 16 -gt 3 2026-03-08T23:31:26.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:789: _scrub_snaps_multi: rados list-inconsistent-pg test 2026-03-08T23:31:26.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:791: _scrub_snaps_multi: jq '. 
| length' td/osd-scrub-snaps/json 2026-03-08T23:31:26.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:791: _scrub_snaps_multi: test 1 = 1 2026-03-08T23:31:26.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:793: _scrub_snaps_multi: jq -r '.[0]' td/osd-scrub-snaps/json 2026-03-08T23:31:26.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:793: _scrub_snaps_multi: test 1.0 = 1.0 2026-03-08T23:31:26.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:795: _scrub_snaps_multi: rados list-inconsistent-obj 1.0 --format=json-pretty 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "epoch": 29, 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "inconsistents": [ 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj5", 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "version": 5 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.679 
INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj5", 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 1, 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1718170787, 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'19", 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'5", 2026-03-08T23:31:26.679 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4179.0:1", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 5, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:24.210961+0000", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:24.211586+0000", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 
2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x5a6f735d", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.680 
INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj5", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 2, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "version": 20 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "missing" 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj5", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 2, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1718170787, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "version": "18'41", 2026-03-08T23:31:26.680 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "17'20", 2026-03-08T23:31:26.681 
INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4227.0:1", 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 20, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.077529+0000", 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.078073+0000", 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e", 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: { 
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "missing"
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj5",
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 4,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "version": 42
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch"
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch_info",
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "obj_size_info_mismatch"
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj5",
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 4,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1718170787,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:31:26.681 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "version": "20'45",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "18'42",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4263.0:1",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 42,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "size": 512,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.416989+0000",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.417538+0000",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4b62a04a",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch_info",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "obj_size_info_mismatch"
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "size": 4608,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "object_info": {
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj5",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 4,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1718170787,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "version": "20'45",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "18'42",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4263.0:1",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 42,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "size": 512,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.416989+0000",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.417538+0000",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4b62a04a",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:31:26.682 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "size": 512
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj4",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 7,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "version": 4
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "missing"
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj4",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": 7,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2826278768,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "version": "23'51",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'4",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4176.0:1",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 4,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:24.187491+0000",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:24.188189+0000",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:31:26.683 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x5a6f735d",
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "missing"
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj5",
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 7,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "version": 0
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "missing",
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "info_missing"
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "info_missing"
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1792
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: "missing"
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.684 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj1",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "version": 18
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "missing"
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj1",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1828249343,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'18",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'1",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4224.0:1",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 18,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.053933+0000",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.056004+0000",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "missing"
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.685 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj10",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "version": 32
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency"
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [],
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj10",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 718195851,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'32",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'10",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4245.0:1",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 32,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.221768+0000",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.222469+0000",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:31:26.686 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "????",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj11",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "version": 34
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency"
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [],
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj11",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 693400951,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'34",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'11",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4248.0:1",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 34,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.245652+0000",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.246273+0000",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.687 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e",
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "clones": []
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj13",
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "version": 38
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency"
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [],
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": {
2026-03-08T23:31:26.688 INFO:tasks.workunit.client.0.vm03.stdout: "oid": {
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj13",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "key": "",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2087409765,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": ""
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'38",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'13",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4254.0:1",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 38,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.294721+0000",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.295349+0000",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "dirty",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest"
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff",
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": {
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {}
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]"
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1,
2026-03-08T23:31:26.689 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true,
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [],
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256,
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": {
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1,
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1,
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032,
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]",
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: 1
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "object": {
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj14",
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "",
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "",
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head",
2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "version": 40 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj14", 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2484217095, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'40", 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'14", 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4257.0:1", 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 40, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.318748+0000", 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.319354+0000", 2026-03-08T23:31:26.690 
INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.690 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e", 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 
2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1033, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.691 
INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.691 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj15", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "version": 15 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_corrupted" 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj15", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 612772309, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.692 
INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "version": "16'15", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "0'0", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4209.0:1", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 15, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:24.438517+0000", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:24.439124+0000", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x5a6f735d", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.692 
INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.692 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_corrupted" 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": "Z2FyYmFnZQo=" 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 0, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [] 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.693 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj2", 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "version": 56 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_missing" 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj2", 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1058988552, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "version": "23'56", 2026-03-08T23:31:26.693 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "20'48", 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4299.0:1", 
2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 56, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "size": 0, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.987936+0000", 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.988446+0000", 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "whiteout", 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "dirty" 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0xffffffff", 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 
2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_missing" 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "size": 0 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "size": 0, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 7, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 4, 2026-03-08T23:31:26.694 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: 4, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: 3, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: 2, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.695 
INFO:tasks.workunit.client.0.vm03.stdout: "snap": 7, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1024, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: 7, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: 6, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: 5 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj3", 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "version": 44 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch" 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [ 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: 
"size_mismatch_info", 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "obj_size_info_mismatch" 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj3", 2026-03-08T23:31:26.695 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1643547569, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19'44", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "17'22", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4269.0:1", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 44, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "size": 768, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.519799+0000", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.520393+0000", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.696 
INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0xe389d906", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "size_mismatch_info", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "obj_size_info_mismatch" 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "size": 3840, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "object_info": { 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "oid": 
"obj3", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1643547569, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19'44", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "17'22", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4269.0:1", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 44, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "size": 768, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.519799+0000", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.520393+0000", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.696 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0xe389d906", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.697 
INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "size": 768 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj6", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "version": 24 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 
2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj6", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2202164420, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'24", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'6", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4233.0:1", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 24, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.125879+0000", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.126470+0000", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.697 
INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.697 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [] 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.698 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj7", 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 
2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "version": 26 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj7", 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 1552453721, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'26", 2026-03-08T23:31:26.698 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'7", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4236.0:1", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 26, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.150589+0000", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.151240+0000", 2026-03-08T23:31:26.699 
INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 0, 
2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [] 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj8", 
2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: "version": 28 2026-03-08T23:31:26.699 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj8", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 2381834917, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'28", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'8", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4239.0:1", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 28, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.700 
INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.173886+0000", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.174473+0000", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.700 
INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 0, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.700 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: 
"overlap": "[]", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "object": { 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "name": "obj9", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "nspace": "", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "locator": "", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "snap": "head", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "version": 30 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [ 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "snapset_inconsistency" 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "union_shard_errors": [], 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "selected_object_info": { 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "oid": { 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "oid": "obj9", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "key": "", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "snapid": -2, 
2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "hash": 3833113727, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "max": 0, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "pool": 1, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "namespace": "" 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "version": "17'30", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "prior_version": "16'9", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "last_reqid": "client.4242.0:1", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "user_version": 30, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "mtime": "2026-03-08T23:30:25.197888+0000", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "local_mtime": "2026-03-08T23:30:25.198503+0000", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "lost": 0, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "flags": [ 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "dirty", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest" 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_seq": 0, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "truncate_size": 0, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "data_digest": "0x4ac1078e", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "omap_digest": "0xffffffff", 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "expected_object_size": 0, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "expected_write_size": 0, 
2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "alloc_hint_flags": 0, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "manifest": { 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "type": 0 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: "watchers": {} 2026-03-08T23:31:26.701 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "shards": [ 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 0, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "primary": false, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "size": "????", 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:31:26.702 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "osd": 1, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "primary": true, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "errors": [], 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "size": 256, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "snapset": { 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "seq": 1, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "clones": [ 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "snap": 1, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "size": 1032, 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "overlap": "[]", 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: "snaps": [ 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: 1 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:31:26.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:797: _scrub_snaps_multi: rados list-inconsistent-snapset 1.0 2026-03-08T23:31:26.705 
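The JSON dumped above is the report from `rados list-inconsistent-snapset 1.0`, which the stderr trace that follows shows the workunit invoking. A minimal sketch (field names taken from the output above; the trimmed sample report and the `summarize` helper are illustrative, not part of the test) of reducing such a report to object/error pairs:

```python
import json

# Trimmed-down report mirroring the structure printed in the log above.
report = json.loads("""
{"inconsistents": [
  {"object": {"name": "obj6", "nspace": "", "snap": "head", "version": 24},
   "errors": ["snapset_inconsistency"],
   "union_shard_errors": []}
]}
""")

def summarize(report):
    # Reduce the scrub report to (object name, object-level errors).
    # Shard-level problems would appear in "union_shard_errors" instead;
    # here they are empty because only the snapsets disagree.
    return [(entry["object"]["name"], entry["errors"])
            for entry in report["inconsistents"]]

assert summarize(report) == [("obj6", ["snapset_inconsistency"])]
```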
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:800: _scrub_snaps_multi: '[' replica = replica ']' 2026-03-08T23:31:26.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:802: _scrub_snaps_multi: scruberrors=20 2026-03-08T23:31:26.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:803: _scrub_snaps_multi: jq .inconsistents 2026-03-08T23:31:26.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:803: _scrub_snaps_multi: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print ( json.dumps(ud, sort_keys=True, indent=2) )' 2026-03-08T23:31:26.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1066: _scrub_snaps_multi: jq .inconsistents td/osd-scrub-snaps/json 2026-03-08T23:31:26.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1066: _scrub_snaps_multi: python3 -c 'import json; import sys ; JSON=sys.stdin.read() ; ud = json.loads(JSON) ; print ( json.dumps(ud, sort_keys=True, indent=2) )' 2026-03-08T23:31:26.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1067: _scrub_snaps_multi: multidiff td/osd-scrub-snaps/checkcsjson td/osd-scrub-snaps/csjson 2026-03-08T23:31:26.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2489: multidiff: diff td/osd-scrub-snaps/checkcsjson td/osd-scrub-snaps/csjson 2026-03-08T23:31:26.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1068: _scrub_snaps_multi: test no = yes 
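The `python3 -c` one-liner traced at lines 803 and 1066 of `osd-scrub-snaps.sh` canonicalizes JSON before `multidiff` runs `diff` on the expected (`checkcsjson`) and actual (`csjson`) files. A standalone sketch of that normalization step (the `canonicalize` helper name is hypothetical; the re-serialization options match the one-liner):

```python
import json

def canonicalize(raw: str) -> str:
    # Re-serialize with sorted keys and fixed two-space indentation so
    # that two semantically identical JSON documents produce a clean,
    # empty diff regardless of original key order or whitespace.
    return json.dumps(json.loads(raw), sort_keys=True, indent=2)

a = canonicalize('{"b": 1, "a": [2, 3]}')
b = canonicalize('{"a": [2, 3], "b": 1}')
assert a == b
```

Without this step, `jq .inconsistents` output and the hand-written expected file could differ only in formatting and still fail the comparison.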
2026-03-08T23:31:26.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1073: _scrub_snaps_multi: test '' = yes 2026-03-08T23:31:26.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1078: _scrub_snaps_multi: find td/osd-scrub-snaps 2026-03-08T23:31:26.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1078: _scrub_snaps_multi: grep 'osd[^/]*\.pid' 2026-03-08T23:31:26.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1078: _scrub_snaps_multi: pidfiles='td/osd-scrub-snaps/osd.0.pid 2026-03-08T23:31:26.735 INFO:tasks.workunit.client.0.vm03.stderr:td/osd-scrub-snaps/osd.1.pid' 2026-03-08T23:31:26.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1079: _scrub_snaps_multi: pids= 2026-03-08T23:31:26.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1080: _scrub_snaps_multi: for pidfile in ${pidfiles} 2026-03-08T23:31:26.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1082: _scrub_snaps_multi: cat td/osd-scrub-snaps/osd.0.pid 2026-03-08T23:31:26.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1082: _scrub_snaps_multi: pids+='473667 ' 2026-03-08T23:31:26.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1080: _scrub_snaps_multi: for pidfile in ${pidfiles} 2026-03-08T23:31:26.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1082: _scrub_snaps_multi: cat 
td/osd-scrub-snaps/osd.1.pid 2026-03-08T23:31:26.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1082: _scrub_snaps_multi: pids+='474231 ' 2026-03-08T23:31:26.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1085: _scrub_snaps_multi: ERRORS=0 2026-03-08T23:31:26.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1089: _scrub_snaps_multi: '[' replica = primary ']' 2026-03-08T23:31:26.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1111: _scrub_snaps_multi: ceph pg dump pgs 2026-03-08T23:31:26.899 INFO:tasks.workunit.client.0.vm03.stdout:PG_STAT OBJECTS MISSING_ON_PRIMARY DEGRADED MISPLACED UNFOUND BYTES OMAP_BYTES* OMAP_KEYS* LOG LOG_DUPS DISK_LOG STATE STATE_STAMP VERSION REPORTED UP UP_PRIMARY ACTING ACTING_PRIMARY LAST_SCRUB SCRUB_STAMP LAST_DEEP_SCRUB DEEP_SCRUB_STAMP SNAPTRIMQ_LEN LAST_SCRUB_DURATION SCRUB_SCHEDULING OBJECTS_SCRUBBED OBJECTS_TRIMMED 2026-03-08T23:31:26.899 INFO:tasks.workunit.client.0.vm03.stdout:1.0 36 0 0 0 0 24448 0 0 56 0 56 active+clean+inconsistent 2026-03-08T23:31:24.875311+0000 23'56 30:105 [1,0] 1 [1,0] 1 23'56 2026-03-08T23:31:24.875268+0000 0'0 2026-03-08T23:30:20.499761+0000 0 1 periodic scrub scheduled @ 2026-03-10T06:00:08.654121+0000 36 0 2026-03-08T23:31:26.899 INFO:tasks.workunit.client.0.vm03.stdout: 2026-03-08T23:31:26.899 INFO:tasks.workunit.client.0.vm03.stdout:* NOTE: Omap statistics are gathered during deep scrub and may be inaccurate soon afterwards depending on utilization. See http://docs.ceph.com/en/latest/dev/placement-group/#omap-statistics for further details. 
2026-03-08T23:31:26.899 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:31:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1113: _scrub_snaps_multi: for pid in $pids 2026-03-08T23:31:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1115: _scrub_snaps_multi: kill -0 473667 2026-03-08T23:31:26.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1113: _scrub_snaps_multi: for pid in $pids 2026-03-08T23:31:26.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1115: _scrub_snaps_multi: kill -0 474231 2026-03-08T23:31:26.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1122: _scrub_snaps_multi: kill_daemons td/osd-scrub-snaps 2026-03-08T23:31:26.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:31:26.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:31:26.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:31:26.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:31:26.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 
2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1124: _scrub_snaps_multi: declare -a err_strings 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1125: _scrub_snaps_multi: err_strings[0]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj4:7 : missing' 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1126: _scrub_snaps_multi: err_strings[1]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] soid .*:::obj3:head : size 3840 != size 768 from auth oi' 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1127: _scrub_snaps_multi: err_strings[2]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj5:1 : missing' 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1128: _scrub_snaps_multi: err_strings[3]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj5:2 : missing' 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1129: _scrub_snaps_multi: err_strings[4]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] soid .*:::obj5:4 : size 4608 != size 512 from auth oi' 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1130: _scrub_snaps_multi: err_strings[5]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid .*:::obj5:7 : failed to pick suitable object info' 2026-03-08T23:31:32.227 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1131: _scrub_snaps_multi: err_strings[6]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj1:head : missing' 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1132: _scrub_snaps_multi: err_strings[7]='log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 20 errors' 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}" 2026-03-08T23:31:32.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj4:7 : missing' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:32.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}" 2026-03-08T23:31:32.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] soid .*:::obj3:head : size 3840 != size 768 from auth oi' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:32.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}" 2026-03-08T23:31:32.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj5:1 : missing' 
td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:32.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}" 2026-03-08T23:31:32.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] .*:::obj5:2 : missing' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:32.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}" 2026-03-08T23:31:32.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard [0-1] soid .*:::obj5:4 : size 4608 != size 512 from auth oi' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:32.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}" 2026-03-08T23:31:32.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 soid .*:::obj5:7 : failed to pick suitable object info' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:32.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}" 2026-03-08T23:31:32.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 shard 
[0-1] .*:::obj1:head : missing' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:32.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1134: _scrub_snaps_multi: for err_string in "${err_strings[@]}" 2026-03-08T23:31:32.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1136: _scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : [0-9]*[.]0 scrub 20 errors' td/osd-scrub-snaps/osd.1.log 2026-03-08T23:31:32.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1144: _scrub_snaps_multi: declare -a rep_err_strings 2026-03-08T23:31:32.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1145: _scrub_snaps_multi: eval echo '$replica' 2026-03-08T23:31:32.242 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1145: _scrub_snaps_multi: echo 0 2026-03-08T23:31:32.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1145: _scrub_snaps_multi: osd=0 2026-03-08T23:31:32.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1146: _scrub_snaps_multi: rep_err_strings[0]='log_channel[(]cluster[)] log [[]ERR[]] : osd[.][0-9]* found snap mapper error on pg 1.0 oid 1:461f8b5e:::obj16:7 snaps missing in mapper, should be: {1, 2, 3, 4, 5, 6, 7} ...repaired' 2026-03-08T23:31:32.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1147: _scrub_snaps_multi: for err_string in "${rep_err_strings[@]}" 2026-03-08T23:31:32.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1149: 
_scrub_snaps_multi: grep 'log_channel[(]cluster[)] log [[]ERR[]] : osd[.][0-9]* found snap mapper error on pg 1.0 oid 1:461f8b5e:::obj16:7 snaps missing in mapper, should be: {1, 2, 3, 4, 5, 6, 7} ...repaired' td/osd-scrub-snaps/osd.0.log 2026-03-08T23:31:32.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1156: _scrub_snaps_multi: '[' 0 '!=' 0 ']' 2026-03-08T23:31:32.244 INFO:tasks.workunit.client.0.vm03.stdout:TEST PASSED 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1162: _scrub_snaps_multi: echo 'TEST PASSED' 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1163: _scrub_snaps_multi: return 0 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1171: TEST_scrub_snaps_replica: err=0 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1172: TEST_scrub_snaps_replica: CEPH_ARGS='--fsid=01b10fa1-bca9-4d6e-99f5-658e1c565807 --auth-supported=none --mon-host=127.0.0.1:7121 ' 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:1173: TEST_scrub_snaps_replica: return 0 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-snaps.sh:40: run: teardown td/osd-scrub-snaps 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-snaps 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: 
teardown: local dumplogs= 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-snaps KILL 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:31:32.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:31:32.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:31:32.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:31:32.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:31:32.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:31:32.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:31:32.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:31:32.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:31:32.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:31:32.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:31:32.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:31:32.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:31:32.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:31:32.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:31:32.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-snaps 2026-03-08T23:31:32.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:31:32.270 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:31:32.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.420670 2026-03-08T23:31:32.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:31:32.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:31:32.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T23:31:32.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-scrub-snaps 0 2026-03-08T23:31:32.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-snaps 2026-03-08T23:31:32.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T23:31:32.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-snaps KILL 2026-03-08T23:31:32.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:31:32.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 
2026-03-08T23:31:32.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:31:32.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:31:32.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:31:32.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:31:32.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:31:32.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:31:32.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:31:32.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:31:32.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:31:32.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:31:32.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:31:32.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:31:32.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:31:32.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:31:32.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:31:32.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T23:31:32.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-snaps 2026-03-08T23:31:32.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:31:32.280 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.420670 2026-03-08T23:31:32.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.420670 2026-03-08T23:31:32.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:31:32.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:31:32.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T23:31:32.281 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T23:31:32.281 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T23:31:32.334 INFO:tasks.workunit:Running workunit scrub/osd-scrub-test.sh... 
2026-03-08T23:31:32.334 DEBUG:teuthology.orchestra.run.vm03:workunit test scrub/osd-scrub-test.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh 2026-03-08T23:31:32.383 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T23:31:32.386 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T23:31:32.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T23:31:32.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T23:31:32.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T23:31:32.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T23:31:32.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T23:31:32.386 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-scrub-test 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:21: run: local dir=td/osd-scrub-test 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:22: run: shift 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:24: run: export CEPH_MON=127.0.0.1:7138 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:24: run: CEPH_MON=127.0.0.1:7138 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:25: run: export CEPH_ARGS 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:26: run: uuidgen 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:26: run: CEPH_ARGS+='--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none ' 2026-03-08T23:31:32.387 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:27: run: CEPH_ARGS+='--mon-host=127.0.0.1:7138 ' 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:29: run: export -n CEPH_CLI_TEST_DUP_COMMAND 2026-03-08T23:31:32.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:30: run: set 2026-03-08T23:31:32.388 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:30: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_deep_scrub_abort ------------------- 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:30: run: local 'funcs=TEST_deep_scrub_abort 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:TEST_dump_scrub_schedule 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:TEST_interval_changes 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:TEST_just_deep_scrubs 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:TEST_pg_dump_objects_scrubbed 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:TEST_scrub_abort 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:TEST_scrub_extended_sleep 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:TEST_scrub_permit_time 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:TEST_scrub_test' 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs 2026-03-08T23:31:32.389 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_deep_scrub_abort -------------------' 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:31:32.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:31:32.389 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:31:32.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:31:32.390 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:31:32.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:31:32.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:31:32.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:31:32.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:31:32.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:31:32.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:31:32.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:31:32.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:31:32.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: 
teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:31:32.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:31:32.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:31:32.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:31:32.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:31:32.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:32.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:31:32.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:31:32.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:31:32.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test 2026-03-08T23:31:32.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:31:32.396 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:32.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827 2026-03-08T23:31:32.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:31:32.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 2026-03-08T23:31:32.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_deep_scrub_abort ----------------------- 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_deep_scrub_abort -----------------------' 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_deep_scrub_abort td/osd-scrub-test 2026-03-08T23:31:32.398 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:414: TEST_deep_scrub_abort: local dir=td/osd-scrub-test 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:415: TEST_deep_scrub_abort: _scrub_abort td/osd-scrub-test deep-scrub 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:298: _scrub_abort: local dir=td/osd-scrub-test 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:299: _scrub_abort: local poolname=test 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:300: _scrub_abort: local OSDS=3 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:301: _scrub_abort: local objects=1000 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:302: _scrub_abort: local type=deep-scrub 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:304: _scrub_abort: TESTDATA=testdata.475827 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:305: _scrub_abort: test deep-scrub = scrub 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:310: _scrub_abort: stopscrub=nodeep-scrub 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:311: _scrub_abort: 
check=nodeep_scrub 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:314: _scrub_abort: run_mon td/osd-scrub-test a --osd_pool_default_size=3 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-test 2026-03-08T23:31:32.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:31:32.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:31:32.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:31:32.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-test/a 2026-03-08T23:31:32.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-test/a --run-dir=td/osd-scrub-test --osd_pool_default_size=3 2026-03-08T23:31:32.653 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:31:32.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:31:32.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:31:32.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:31:32.654 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.654 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:32.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:31:32.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-test/a '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-test/log --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=3 2026-03-08T23:31:32.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:31:32.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:31:32.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:31:32.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local 
id=a 2026-03-08T23:31:32.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:31:32.682 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:31:32.682 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:31:32.683 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:31:32.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:31:32.683 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:31:32.683 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.683 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:32.684 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:31:32.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:31:32.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get fsid 2026-03-08T23:31:32.760 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:32.760 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: 
get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:31:32.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:31:32.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get mon_host 2026-03-08T23:31:32.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:315: _scrub_abort: run_mgr td/osd-scrub-test x --mgr_stats_period=1 2026-03-08T23:31:32.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-test 2026-03-08T23:31:32.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:31:32.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:31:32.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:31:32.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-test/x 2026-03-08T23:31:32.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:31:32.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:31:32.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 
2026-03-08T23:31:32.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:31:32.953 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:31:32.953 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.953 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:32.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:31:32.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:31:32.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-test/x '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mgr_stats_period=1 2026-03-08T23:31:32.975 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: expr 3 - 1 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: seq 0 2 
2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:322: _scrub_abort: run_osd td/osd-scrub-test 0 --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/0 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 
2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/0' 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/0/journal' 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:31:32.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:32.983 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:31:32.983 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq' 2026-03-08T23:31:32.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/0 2026-03-08T23:31:32.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:31:32.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2664f748-46f9-48ab-a759-db5c746c63ea 2026-03-08T23:31:32.984 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 2664f748-46f9-48ab-a759-db5c746c63ea 2026-03-08T23:31:32.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 2664f748-46f9-48ab-a759-db5c746c63ea' 2026-03-08T23:31:32.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:31:32.997 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDVBq5p5mVAABAAJbMnu4q/hbEhuq+ibzpv/A== 2026-03-08T23:31:32.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDVBq5p5mVAABAAJbMnu4q/hbEhuq+ibzpv/A=="}' 2026-03-08T23:31:32.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2664f748-46f9-48ab-a759-db5c746c63ea -i td/osd-scrub-test/0/new.json 2026-03-08T23:31:33.159 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:31:33.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/0/new.json 2026-03-08T23:31:33.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq --mkfs --key AQDVBq5p5mVAABAAJbMnu4q/hbEhuq+ibzpv/A== --osd-uuid 2664f748-46f9-48ab-a759-db5c746c63ea 2026-03-08T23:31:33.224 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:33.225+0000 7f080d1208c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:33.293 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:33.297+0000 7f080d1208c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:33.295 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:33.297+0000 7f080d1208c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:33.295 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:33.297+0000 7f080d1208c0 -1 bdev(0x561f09644c00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:31:33.295 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:33.297+0000 7f080d1208c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid 2026-03-08T23:31:35.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/0/keyring 2026-03-08T23:31:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:31:35.809 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:31:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:31:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:31:36.002 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:31:36.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 
2026-03-08T23:31:36.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq 2026-03-08T23:31:36.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:31:36.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:31:36.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:31:36.019 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:36.021+0000 7f148fc198c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:36.019 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:36.021+0000 7f148fc198c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:36.021 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:36.025+0000 7f148fc198c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:36.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:31:36.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:36.981 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:36.985+0000 7f148fc198c0 -1 Falling back to public interface 2026-03-08T23:31:37.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:31:37.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:37.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:31:37.341 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:31:37.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:37.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:31:37.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:38.201 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:38.204+0000 7f148fc198c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:31:38.513 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:31:38.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:38.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:38.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:31:38.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:38.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:31:38.687 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 
up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2332668265,v1:127.0.0.1:6803/2332668265] [v2:127.0.0.1:6804/2332668265,v1:127.0.0.1:6805/2332668265] exists,up 2664f748-46f9-48ab-a759-db5c746c63ea 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:322: _scrub_abort: run_osd td/osd-scrub-test 1 --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:31:38.688 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/1 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/1' 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/1/journal' 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:31:38.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:31:38.688 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:31:38.689 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq' 2026-03-08T23:31:38.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/1 2026-03-08T23:31:38.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 
2026-03-08T23:31:38.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=aec81535-9914-48bc-86fe-48dc81b6eded 2026-03-08T23:31:38.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 aec81535-9914-48bc-86fe-48dc81b6eded' 2026-03-08T23:31:38.691 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 aec81535-9914-48bc-86fe-48dc81b6eded 2026-03-08T23:31:38.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:31:38.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDaBq5pOLJOKhAAsLBymWXQk97ZZfhvCNZS9g== 2026-03-08T23:31:38.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDaBq5pOLJOKhAAsLBymWXQk97ZZfhvCNZS9g=="}' 2026-03-08T23:31:38.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new aec81535-9914-48bc-86fe-48dc81b6eded -i td/osd-scrub-test/1/new.json 2026-03-08T23:31:38.864 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:31:38.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/1/new.json 2026-03-08T23:31:38.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= 
--run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq --mkfs --key AQDaBq5pOLJOKhAAsLBymWXQk97ZZfhvCNZS9g== --osd-uuid aec81535-9914-48bc-86fe-48dc81b6eded 2026-03-08T23:31:38.892 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:38.896+0000 7f13c449e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:38.894 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:38.896+0000 7f13c449e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:38.895 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:38.896+0000 7f13c449e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:31:38.895 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:38.896+0000 7f13c449e8c0 -1 bdev(0x55c31900fc00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:31:38.895 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:38.900+0000 7f13c449e8c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid 2026-03-08T23:31:41.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/1/keyring 2026-03-08T23:31:41.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:31:41.405 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:31:41.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:31:41.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:31:41.618 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:31:41.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:31:41.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 
--debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq 2026-03-08T23:31:41.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:31:41.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:31:41.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:31:41.635 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:41.636+0000 7f21a05a28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:41.636 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:41.640+0000 7f21a05a28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:41.637 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:41.640+0000 7f21a05a28c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:41.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:41.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:42.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:42.844+0000 7f21a05a28c0 -1 Falling back to public interface 2026-03-08T23:31:42.967 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:31:42.967 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:42.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:42.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:31:42.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:42.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:43.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:44.054 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:44.056+0000 7f21a05a28c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:31:44.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:44.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:44.124 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:31:44.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:31:44.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:44.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:44.310 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:45.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:45.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:45.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:31:45.311 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:31:45.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:45.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1316181536,v1:127.0.0.1:6811/1316181536] [v2:127.0.0.1:6812/1316181536,v1:127.0.0.1:6813/1316181536] exists,up aec81535-9914-48bc-86fe-48dc81b6eded 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: for osd in $(seq 0 $(expr $OSDS - 
1)) 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:322: _scrub_abort: run_osd td/osd-scrub-test 2 --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/2 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 
2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/2' 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/2/journal' 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:31:45.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:31:45.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:31:45.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:31:45.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:31:45.480 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:31:45.480 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:31:45.480 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:31:45.480 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:31:45.481 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq' 2026-03-08T23:31:45.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/2 2026-03-08T23:31:45.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:31:45.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=df687553-09db-48d1-86ed-7d662dc64654 2026-03-08T23:31:45.482 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 df687553-09db-48d1-86ed-7d662dc64654 2026-03-08T23:31:45.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 df687553-09db-48d1-86ed-7d662dc64654' 2026-03-08T23:31:45.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:31:45.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDhBq5pctjbHRAAzNJCttICItz4QEqrRmRmvA== 2026-03-08T23:31:45.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: 
run_osd: echo '{"cephx_secret": "AQDhBq5pctjbHRAAzNJCttICItz4QEqrRmRmvA=="}' 2026-03-08T23:31:45.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new df687553-09db-48d1-86ed-7d662dc64654 -i td/osd-scrub-test/2/new.json 2026-03-08T23:31:45.693 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:31:45.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/2/new.json 2026-03-08T23:31:45.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq --mkfs --key AQDhBq5pctjbHRAAzNJCttICItz4QEqrRmRmvA== --osd-uuid df687553-09db-48d1-86ed-7d662dc64654 2026-03-08T23:31:45.727 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:45.728+0000 7f3ec26438c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:45.729 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:45.732+0000 7f3ec26438c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:31:45.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:45.732+0000 7f3ec26438c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:45.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:45.732+0000 7f3ec26438c0 -1 bdev(0x56161eeb7c00 td/osd-scrub-test/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:31:45.730 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:45.732+0000 7f3ec26438c0 -1 bluestore(td/osd-scrub-test/2) _read_fsid unparsable uuid 2026-03-08T23:31:48.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/2/keyring 2026-03-08T23:31:48.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:31:48.297 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:31:48.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:31:48.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:31:48.500 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:31:48.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:31:48.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq 2026-03-08T23:31:48.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:31:48.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:31:48.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:31:48.516 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:48.516+0000 7efc24d418c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:48.522 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:48.524+0000 7efc24d418c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:31:48.523 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:48.524+0000 7efc24d418c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:31:48.679 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:48.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:31:48.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:49.469 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:49.472+0000 7efc24d418c0 -1 Falling back to public interface 2026-03-08T23:31:49.845 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:31:49.845 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:49.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:49.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:31:49.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:49.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:31:50.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:31:50.685 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:31:50.688+0000 7efc24d418c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:31:51.015 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:31:51.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:31:51.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:31:51.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:31:51.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:31:51.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:31:51.190 
INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/1869901562,v1:127.0.0.1:6819/1869901562] [v2:127.0.0.1:6820/1869901562,v1:127.0.0.1:6821/1869901562] exists,up df687553-09db-48d1-86ed-7d662dc64654 2026-03-08T23:31:51.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:31:51.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:31:51.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:31:51.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:330: _scrub_abort: create_pool test 1 1 2026-03-08T23:31:51.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T23:31:51.400 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:31:51.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:31:52.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:331: _scrub_abort: wait_for_clean 2026-03-08T23:31:52.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:31:52.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:31:52.418 
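The `wait_for_osd` polling traced above (counter echoed to stdout, `ceph osd dump | grep` once per second, up to 300 attempts) can be sketched as below. This is an assumed simplification of the helper in `qa/standalone/ceph-helpers.sh`, not the verbatim function (the real one also handles other states), and the `ceph()` function here is a hypothetical stub so the sketch runs without a live cluster.

```shell
ceph() { echo "osd.2 up in weight 1 ..."; }   # hypothetical stub for the real ceph CLI

wait_for_osd() {
    local state=$1 id=$2 status=1
    for ((i = 0; i < 300; i++)); do           # poll up to 300 times, 1s apart
        echo $i                               # the attempt counter seen on stdout
        if ceph osd dump | grep -q "osd.$id $state"; then
            status=0                          # osd reached the wanted state
            break
        fi
        sleep 1
    done
    return $status
}

wait_for_osd up 2
```

With the stub answering immediately, the loop prints `0` and returns 0 on the first attempt; against a real cluster it keeps counting until `osd.2 up` appears in the osdmap, as in the trace.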
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:31:52.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:31:52.418 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:31:52.418 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:31:52.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:31:52.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:31:52.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:31:52.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:31:52.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:31:52.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:31:52.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:31:52.485 
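The `delays` array produced by `get_timeout_delays 90 .1` in the trace (`0.1 0.2 0.4 ... 15 15 15 15 4.5`) follows a simple rule: each delay doubles from the first step, is capped at 15 s, and the final entry is trimmed so the series sums to the requested 90 s timeout. A reconstructed sketch of that computation (not the verbatim helper, which also toggles xtrace and uses different arithmetic) could look like:

```shell
# Emit doubling delays starting at $2, capped at $3 (default 15s),
# trimming the last one so the total equals the $1 timeout.
get_timeout_delays() {
    local timeout=$1 first_step=${2:-1} max_timeout=${3:-15}
    awk -v timeout="$timeout" -v step="$first_step" -v cap="$max_timeout" '
        BEGIN {
            total = 0
            while (total < timeout) {
                d = step
                if (d > cap) d = cap                        # cap each delay
                if (d > timeout - total) d = timeout - total  # trim the last one
                printf "%g ", d
                total += d
                step *= 2
            }
            print ""
        }'
}

get_timeout_delays 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

`wait_for_clean` then sleeps through this schedule between checks, so it gives up after roughly the requested timeout while polling aggressively at first.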
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:31:52.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:31:52.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:31:52.661 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:31:52.661 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:31:52.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:31:52.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:31:52.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:31:52.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836495 2026-03-08T23:31:52.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836495 2026-03-08T23:31:52.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495' 2026-03-08T23:31:52.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:31:52.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:31:52.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672969 2026-03-08T23:31:52.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672969 2026-03-08T23:31:52.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672969' 2026-03-08T23:31:52.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:31:52.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:31:52.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542147 2026-03-08T23:31:52.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542147 2026-03-08T23:31:52.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672969 2-60129542147' 2026-03-08T23:31:52.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:31:52.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836495 2026-03-08T23:31:52.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:31:52.917 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:31:52.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836495 2026-03-08T23:31:52.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:31:52.919 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836495 2026-03-08T23:31:52.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836495 2026-03-08T23:31:52.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836495' 2026-03-08T23:31:52.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:31:53.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836493 -lt 21474836495 2026-03-08T23:31:53.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:31:54.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:31:54.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:31:54.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 
21474836495 2026-03-08T23:31:54.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:31:54.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672969 2026-03-08T23:31:54.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:31:54.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:31:54.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672969 2026-03-08T23:31:54.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:31:54.266 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672969 2026-03-08T23:31:54.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672969 2026-03-08T23:31:54.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672969' 2026-03-08T23:31:54.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:31:54.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672969 2026-03-08T23:31:54.443 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:31:54.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542147 2026-03-08T23:31:54.443 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:31:54.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:31:54.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542147 2026-03-08T23:31:54.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:31:54.446 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542147 2026-03-08T23:31:54.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542147 2026-03-08T23:31:54.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542147' 2026-03-08T23:31:54.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:31:54.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542147 -lt 60129542147 2026-03-08T23:31:54.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:31:54.615 
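The `flush_pg_stats` handshake traced above is a two-phase wait: ask each OSD for a flush sequence number via `ceph tell osd.N flush_pg_stats`, then poll `ceph osd last-stat-seq N` until the reported value catches up to that sequence. A condensed sketch of this shape (assumed from the trace, not the verbatim helper; the real one iterates `ceph osd ls` and uses `cut` to split the `osd-seq` pairs) with a hypothetical `ceph()` stub so it runs without a cluster:

```shell
ceph() {                                       # hypothetical stub for the real CLI
    case "$1 $2" in
        "tell osd.0")        echo 21474836495 ;;  # seq returned by the flush request
        "osd last-stat-seq") echo 21474836496 ;;  # monitor has already caught up
    esac
}

flush_pg_stats() {
    local seqs="" osd seq s
    for osd in 0; do                           # real helper: for osd in $(ceph osd ls)
        seq=$(ceph tell osd.$osd flush_pg_stats)
        seqs="$seqs $osd-$seq"
    done
    for s in $seqs; do
        osd=${s%-*}; seq=${s#*-}
        echo "waiting osd.$osd seq $seq"
        while test "$(ceph osd last-stat-seq $osd)" -lt "$seq"; do
            sleep 1                            # monitor not caught up yet
        done
    done
}

flush_pg_stats
```

Note the off-by-a-few tolerance visible in the trace: osd.0's first poll saw `21474836493 -lt 21474836495` and slept once before the monitor caught up, which is exactly the case the `while` loop absorbs.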
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:31:54.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:31:54.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:31:54.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:31:54.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:31:54.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:31:54.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:31:54.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:31:54.819 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:31:54.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:31:54.990 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:31:54.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:31:54.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:31:54.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:31:55.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:31:55.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:31:55.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:31:55.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:332: _scrub_abort: ceph osd dump 2026-03-08T23:31:55.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:332: _scrub_abort: awk '{ print $2 }' 2026-03-08T23:31:55.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:332: _scrub_abort: grep '^pool.*['\'']test['\'']' 2026-03-08T23:31:55.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:332: _scrub_abort: poolid=1 2026-03-08T23:31:55.371 
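After the stats flush, `wait_for_clean` compares the number of `active+clean` (and not `stale`) PGs from `ceph pg dump` against the total PG count from `ceph status`, sleeping through the backoff delays until they match. A condensed sketch of that loop (assumed from the trace; the two `get_num_*` stubs below stand in for the real `ceph ... | jq` pipelines so the sketch runs without a cluster):

```shell
# Stubs for: ceph --format json status | jq .pgmap.num_pgs
# and: ceph --format json pg dump pgs | jq '.pg_stats | [.[] | .state |
#   select(contains("active") and contains("clean")) |
#   select(contains("stale") | not)] | length'
get_num_pgs()          { echo 1; }
get_num_active_clean() { echo 1; }

wait_for_clean() {
    local -a delays=(0.1 0.2 0.4)              # real helper: $(get_timeout_delays 90 .1)
    local -i loop=0
    while test "$(get_num_active_clean)" != "$(get_num_pgs)"; do
        test $loop -ge ${#delays[*]} && return 1   # delays exhausted: not clean
        sleep ${delays[$loop]}
        loop+=1
    done
    return 0
}

wait_for_clean && echo clean
```

With the stubs both reporting 1, the first comparison already matches (as in the trace, where `test 1 = 1` broke the loop immediately) and the function returns 0 without sleeping.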
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:334: _scrub_abort: dd if=/dev/urandom of=testdata.475827 bs=1032 count=1 2026-03-08T23:31:55.371 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:31:55.371 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:31:55.371 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 7.0562e-05 s, 14.6 MB/s 2026-03-08T23:31:55.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: seq 1 1000 2026-03-08T23:31:55.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj1 testdata.475827 2026-03-08T23:31:55.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj2 testdata.475827 2026-03-08T23:31:55.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj3 testdata.475827 2026-03-08T23:31:55.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 
2026-03-08T23:31:55.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj4 testdata.475827 2026-03-08T23:31:55.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj5 testdata.475827 2026-03-08T23:31:55.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj6 testdata.475827 2026-03-08T23:31:55.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj7 testdata.475827 2026-03-08T23:31:55.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj8 testdata.475827 2026-03-08T23:31:55.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.560 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj9 testdata.475827 2026-03-08T23:31:55.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj10 testdata.475827 2026-03-08T23:31:55.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj11 testdata.475827 2026-03-08T23:31:55.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj12 testdata.475827 2026-03-08T23:31:55.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj13 testdata.475827 2026-03-08T23:31:55.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.680 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj14 testdata.475827 2026-03-08T23:31:55.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj15 testdata.475827 2026-03-08T23:31:55.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj16 testdata.475827 2026-03-08T23:31:55.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj17 testdata.475827 2026-03-08T23:31:55.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj18 testdata.475827 2026-03-08T23:31:55.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.791 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj19 testdata.475827 2026-03-08T23:31:55.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj20 testdata.475827 2026-03-08T23:31:55.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj21 testdata.475827 2026-03-08T23:31:55.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj22 testdata.475827 2026-03-08T23:31:55.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj23 testdata.475827 2026-03-08T23:31:55.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.903 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj24 testdata.475827 2026-03-08T23:31:55.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj25 testdata.475827 2026-03-08T23:31:55.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj26 testdata.475827 2026-03-08T23:31:55.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj27 testdata.475827 2026-03-08T23:31:55.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:55.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj28 testdata.475827 2026-03-08T23:31:56.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.013 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj29 testdata.475827 2026-03-08T23:31:56.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj30 testdata.475827 2026-03-08T23:31:56.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj31 testdata.475827 2026-03-08T23:31:56.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj32 testdata.475827 2026-03-08T23:31:56.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj33 testdata.475827 2026-03-08T23:31:56.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.130 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj34 testdata.475827 2026-03-08T23:31:56.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj35 testdata.475827 2026-03-08T23:31:56.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj36 testdata.475827 2026-03-08T23:31:56.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj37 testdata.475827 2026-03-08T23:31:56.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj38 testdata.475827 2026-03-08T23:31:56.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.244 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj39 testdata.475827 2026-03-08T23:31:56.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj40 testdata.475827 2026-03-08T23:31:56.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj41 testdata.475827 2026-03-08T23:31:56.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj42 testdata.475827 2026-03-08T23:31:56.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj43 testdata.475827 2026-03-08T23:31:56.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.357 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj44 testdata.475827 2026-03-08T23:31:56.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj45 testdata.475827 2026-03-08T23:31:56.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj46 testdata.475827 2026-03-08T23:31:56.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj47 testdata.475827 2026-03-08T23:31:56.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj48 testdata.475827 2026-03-08T23:31:56.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.475 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj49 testdata.475827 2026-03-08T23:31:56.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj50 testdata.475827 2026-03-08T23:31:56.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj51 testdata.475827 2026-03-08T23:31:56.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj52 testdata.475827 2026-03-08T23:31:56.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj53 testdata.475827 2026-03-08T23:31:56.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.586 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj54 testdata.475827 2026-03-08T23:31:56.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj55 testdata.475827 2026-03-08T23:31:56.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj56 testdata.475827 2026-03-08T23:31:56.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj57 testdata.475827 2026-03-08T23:31:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj58 testdata.475827 2026-03-08T23:31:56.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.701 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj59 testdata.475827 2026-03-08T23:31:56.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj60 testdata.475827 2026-03-08T23:31:56.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj61 testdata.475827 2026-03-08T23:31:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj62 testdata.475827 2026-03-08T23:31:56.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj63 testdata.475827 2026-03-08T23:31:56.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.817 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj64 testdata.475827 2026-03-08T23:31:56.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj65 testdata.475827 2026-03-08T23:31:56.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj66 testdata.475827 2026-03-08T23:31:56.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj67 testdata.475827 2026-03-08T23:31:56.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj68 testdata.475827 2026-03-08T23:31:56.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.931 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj69 testdata.475827 2026-03-08T23:31:56.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj70 testdata.475827 2026-03-08T23:31:56.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj71 testdata.475827 2026-03-08T23:31:56.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:56.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj72 testdata.475827 2026-03-08T23:31:57.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj73 testdata.475827 2026-03-08T23:31:57.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.045 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj74 testdata.475827 2026-03-08T23:31:57.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj75 testdata.475827 2026-03-08T23:31:57.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj76 testdata.475827 2026-03-08T23:31:57.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj77 testdata.475827 2026-03-08T23:31:57.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj78 testdata.475827 2026-03-08T23:31:57.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.159 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj79 testdata.475827 2026-03-08T23:31:57.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj80 testdata.475827 2026-03-08T23:31:57.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj81 testdata.475827 2026-03-08T23:31:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj82 testdata.475827 2026-03-08T23:31:57.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj83 testdata.475827 2026-03-08T23:31:57.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.273 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj84 testdata.475827 2026-03-08T23:31:57.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj85 testdata.475827 2026-03-08T23:31:57.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj86 testdata.475827 2026-03-08T23:31:57.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj87 testdata.475827 2026-03-08T23:31:57.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj88 testdata.475827 2026-03-08T23:31:57.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.393 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj89 testdata.475827 2026-03-08T23:31:57.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj90 testdata.475827 2026-03-08T23:31:57.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj91 testdata.475827 2026-03-08T23:31:57.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj92 testdata.475827 2026-03-08T23:31:57.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj93 testdata.475827 2026-03-08T23:31:57.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.504 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj94 testdata.475827 2026-03-08T23:31:57.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj95 testdata.475827 2026-03-08T23:31:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj96 testdata.475827 2026-03-08T23:31:57.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj97 testdata.475827 2026-03-08T23:31:57.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj98 testdata.475827 2026-03-08T23:31:57.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.619 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj99 testdata.475827 2026-03-08T23:31:57.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj100 testdata.475827 2026-03-08T23:31:57.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj101 testdata.475827 2026-03-08T23:31:57.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj102 testdata.475827 2026-03-08T23:31:57.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj103 testdata.475827 2026-03-08T23:31:57.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.711 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj104 testdata.475827 2026-03-08T23:31:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj105 testdata.475827 2026-03-08T23:31:57.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj106 testdata.475827 2026-03-08T23:31:57.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj107 testdata.475827 2026-03-08T23:31:57.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj108 testdata.475827 2026-03-08T23:31:57.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.798 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj109 testdata.475827 2026-03-08T23:31:57.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj110 testdata.475827 2026-03-08T23:31:57.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj111 testdata.475827 2026-03-08T23:31:57.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj112 testdata.475827 2026-03-08T23:31:57.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj113 testdata.475827 2026-03-08T23:31:57.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:31:57.909 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj114 testdata.475827
2026-03-08T23:31:57.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:31:57.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj115 testdata.475827
[... identical loop iterations for obj116 through obj217 elided: each repeats osd-scrub-test.sh:335 (`for i in \`seq 1 $objects\``) and osd-scrub-test.sh:337 (`rados -p test put objN testdata.475827`), timestamps 2026-03-08T23:31:57.954 through 2026-03-08T23:32:00.388 ...]
2026-03-08T23:32:00.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:00.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj218 testdata.475827
2026-03-08T23:32:00.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:00.434 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj219 testdata.475827 2026-03-08T23:32:00.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj220 testdata.475827 2026-03-08T23:32:00.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj221 testdata.475827 2026-03-08T23:32:00.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj222 testdata.475827 2026-03-08T23:32:00.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj223 testdata.475827 2026-03-08T23:32:00.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.549 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj224 testdata.475827 2026-03-08T23:32:00.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj225 testdata.475827 2026-03-08T23:32:00.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj226 testdata.475827 2026-03-08T23:32:00.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj227 testdata.475827 2026-03-08T23:32:00.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj228 testdata.475827 2026-03-08T23:32:00.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.663 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj229 testdata.475827 2026-03-08T23:32:00.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj230 testdata.475827 2026-03-08T23:32:00.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj231 testdata.475827 2026-03-08T23:32:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj232 testdata.475827 2026-03-08T23:32:00.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj233 testdata.475827 2026-03-08T23:32:00.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.833 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj234 testdata.475827 2026-03-08T23:32:00.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj235 testdata.475827 2026-03-08T23:32:00.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj236 testdata.475827 2026-03-08T23:32:00.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:00.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj237 testdata.475827 2026-03-08T23:32:01.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj238 testdata.475827 2026-03-08T23:32:01.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.030 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj239 testdata.475827 2026-03-08T23:32:01.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj240 testdata.475827 2026-03-08T23:32:01.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj241 testdata.475827 2026-03-08T23:32:01.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj242 testdata.475827 2026-03-08T23:32:01.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj243 testdata.475827 2026-03-08T23:32:01.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.145 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj244 testdata.475827 2026-03-08T23:32:01.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj245 testdata.475827 2026-03-08T23:32:01.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj246 testdata.475827 2026-03-08T23:32:01.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj247 testdata.475827 2026-03-08T23:32:01.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj248 testdata.475827 2026-03-08T23:32:01.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.261 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj249 testdata.475827 2026-03-08T23:32:01.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj250 testdata.475827 2026-03-08T23:32:01.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj251 testdata.475827 2026-03-08T23:32:01.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj252 testdata.475827 2026-03-08T23:32:01.354 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj253 testdata.475827 2026-03-08T23:32:01.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.381 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj254 testdata.475827 2026-03-08T23:32:01.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj255 testdata.475827 2026-03-08T23:32:01.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj256 testdata.475827 2026-03-08T23:32:01.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj257 testdata.475827 2026-03-08T23:32:01.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj258 testdata.475827 2026-03-08T23:32:01.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.497 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj259 testdata.475827 2026-03-08T23:32:01.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj260 testdata.475827 2026-03-08T23:32:01.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj261 testdata.475827 2026-03-08T23:32:01.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj262 testdata.475827 2026-03-08T23:32:01.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj263 testdata.475827 2026-03-08T23:32:01.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.616 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj264 testdata.475827 2026-03-08T23:32:01.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj265 testdata.475827 2026-03-08T23:32:01.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj266 testdata.475827 2026-03-08T23:32:01.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj267 testdata.475827 2026-03-08T23:32:01.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj268 testdata.475827 2026-03-08T23:32:01.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.740 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj269 testdata.475827 2026-03-08T23:32:01.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj270 testdata.475827 2026-03-08T23:32:01.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj271 testdata.475827 2026-03-08T23:32:01.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj272 testdata.475827 2026-03-08T23:32:01.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj273 testdata.475827 2026-03-08T23:32:01.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.856 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj274 testdata.475827 2026-03-08T23:32:01.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj275 testdata.475827 2026-03-08T23:32:01.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj276 testdata.475827 2026-03-08T23:32:01.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj277 testdata.475827 2026-03-08T23:32:01.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj278 testdata.475827 2026-03-08T23:32:01.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.968 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj279 testdata.475827 2026-03-08T23:32:01.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:01.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj280 testdata.475827 2026-03-08T23:32:02.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:02.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj281 testdata.475827 2026-03-08T23:32:02.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:02.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj282 testdata.475827 2026-03-08T23:32:02.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:02.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj283 testdata.475827 2026-03-08T23:32:02.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:02.080 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj284 testdata.475827
2026-03-08T23:32:02.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:02.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj285 testdata.475827
[identical xtrace pair repeated for obj286 through obj387, 2026-03-08T23:32:02.127 to 2026-03-08T23:32:04.930]
2026-03-08T23:32:04.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj388 testdata.475827
2026-03-08T23:32:04.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:04.950 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj389 testdata.475827 2026-03-08T23:32:04.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:04.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj390 testdata.475827 2026-03-08T23:32:04.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:04.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj391 testdata.475827 2026-03-08T23:32:05.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj392 testdata.475827 2026-03-08T23:32:05.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj393 testdata.475827 2026-03-08T23:32:05.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.060 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj394 testdata.475827 2026-03-08T23:32:05.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj395 testdata.475827 2026-03-08T23:32:05.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj396 testdata.475827 2026-03-08T23:32:05.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj397 testdata.475827 2026-03-08T23:32:05.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj398 testdata.475827 2026-03-08T23:32:05.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.171 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj399 testdata.475827 2026-03-08T23:32:05.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj400 testdata.475827 2026-03-08T23:32:05.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj401 testdata.475827 2026-03-08T23:32:05.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj402 testdata.475827 2026-03-08T23:32:05.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj403 testdata.475827 2026-03-08T23:32:05.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.282 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj404 testdata.475827 2026-03-08T23:32:05.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj405 testdata.475827 2026-03-08T23:32:05.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj406 testdata.475827 2026-03-08T23:32:05.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj407 testdata.475827 2026-03-08T23:32:05.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj408 testdata.475827 2026-03-08T23:32:05.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.398 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj409 testdata.475827 2026-03-08T23:32:05.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj410 testdata.475827 2026-03-08T23:32:05.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj411 testdata.475827 2026-03-08T23:32:05.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj412 testdata.475827 2026-03-08T23:32:05.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj413 testdata.475827 2026-03-08T23:32:05.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.514 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj414 testdata.475827 2026-03-08T23:32:05.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj415 testdata.475827 2026-03-08T23:32:05.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj416 testdata.475827 2026-03-08T23:32:05.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj417 testdata.475827 2026-03-08T23:32:05.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj418 testdata.475827 2026-03-08T23:32:05.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.626 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj419 testdata.475827 2026-03-08T23:32:05.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj420 testdata.475827 2026-03-08T23:32:05.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj421 testdata.475827 2026-03-08T23:32:05.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj422 testdata.475827 2026-03-08T23:32:05.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj423 testdata.475827 2026-03-08T23:32:05.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.732 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj424 testdata.475827 2026-03-08T23:32:05.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj425 testdata.475827 2026-03-08T23:32:05.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj426 testdata.475827 2026-03-08T23:32:05.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj427 testdata.475827 2026-03-08T23:32:05.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj428 testdata.475827 2026-03-08T23:32:05.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.836 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj429 testdata.475827 2026-03-08T23:32:05.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj430 testdata.475827 2026-03-08T23:32:05.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj431 testdata.475827 2026-03-08T23:32:05.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj432 testdata.475827 2026-03-08T23:32:05.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj433 testdata.475827 2026-03-08T23:32:05.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.944 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj434 testdata.475827 2026-03-08T23:32:05.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj435 testdata.475827 2026-03-08T23:32:05.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:05.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj436 testdata.475827 2026-03-08T23:32:06.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj437 testdata.475827 2026-03-08T23:32:06.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj438 testdata.475827 2026-03-08T23:32:06.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.054 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj439 testdata.475827 2026-03-08T23:32:06.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj440 testdata.475827 2026-03-08T23:32:06.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj441 testdata.475827 2026-03-08T23:32:06.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj442 testdata.475827 2026-03-08T23:32:06.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj443 testdata.475827 2026-03-08T23:32:06.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.161 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj444 testdata.475827 2026-03-08T23:32:06.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj445 testdata.475827 2026-03-08T23:32:06.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj446 testdata.475827 2026-03-08T23:32:06.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj447 testdata.475827 2026-03-08T23:32:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj448 testdata.475827 2026-03-08T23:32:06.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.265 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj449 testdata.475827 2026-03-08T23:32:06.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj450 testdata.475827 2026-03-08T23:32:06.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj451 testdata.475827 2026-03-08T23:32:06.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj452 testdata.475827 2026-03-08T23:32:06.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj453 testdata.475827 2026-03-08T23:32:06.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:06.370 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj454 testdata.475827
2026-03-08T23:32:06.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj455 testdata.475827
2026-03-08T23:32:06.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj456 testdata.475827
2026-03-08T23:32:06.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj457 testdata.475827
2026-03-08T23:32:06.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj458 testdata.475827
2026-03-08T23:32:06.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj459 testdata.475827
2026-03-08T23:32:06.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj460 testdata.475827
2026-03-08T23:32:06.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj461 testdata.475827
2026-03-08T23:32:06.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj462 testdata.475827
2026-03-08T23:32:06.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj463 testdata.475827
2026-03-08T23:32:06.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj464 testdata.475827
2026-03-08T23:32:06.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj465 testdata.475827
2026-03-08T23:32:06.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj466 testdata.475827
2026-03-08T23:32:06.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj467 testdata.475827
2026-03-08T23:32:06.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj468 testdata.475827
2026-03-08T23:32:06.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj469 testdata.475827
2026-03-08T23:32:06.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj470 testdata.475827
2026-03-08T23:32:06.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj471 testdata.475827
2026-03-08T23:32:06.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj472 testdata.475827
2026-03-08T23:32:06.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj473 testdata.475827
2026-03-08T23:32:06.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj474 testdata.475827
2026-03-08T23:32:06.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj475 testdata.475827
2026-03-08T23:32:06.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj476 testdata.475827
2026-03-08T23:32:06.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj477 testdata.475827
2026-03-08T23:32:06.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj478 testdata.475827
2026-03-08T23:32:06.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj479 testdata.475827
2026-03-08T23:32:06.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj480 testdata.475827
2026-03-08T23:32:06.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj481 testdata.475827
2026-03-08T23:32:06.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:06.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj482 testdata.475827
2026-03-08T23:32:07.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj483 testdata.475827
2026-03-08T23:32:07.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj484 testdata.475827
2026-03-08T23:32:07.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj485 testdata.475827
2026-03-08T23:32:07.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj486 testdata.475827
2026-03-08T23:32:07.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj487 testdata.475827
2026-03-08T23:32:07.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj488 testdata.475827
2026-03-08T23:32:07.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj489 testdata.475827
2026-03-08T23:32:07.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj490 testdata.475827
2026-03-08T23:32:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj491 testdata.475827
2026-03-08T23:32:07.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj492 testdata.475827
2026-03-08T23:32:07.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj493 testdata.475827
2026-03-08T23:32:07.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj494 testdata.475827
2026-03-08T23:32:07.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj495 testdata.475827
2026-03-08T23:32:07.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj496 testdata.475827
2026-03-08T23:32:07.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj497 testdata.475827
2026-03-08T23:32:07.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj498 testdata.475827
2026-03-08T23:32:07.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj499 testdata.475827
2026-03-08T23:32:07.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj500 testdata.475827
2026-03-08T23:32:07.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj501 testdata.475827
2026-03-08T23:32:07.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj502 testdata.475827
2026-03-08T23:32:07.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj503 testdata.475827
2026-03-08T23:32:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj504 testdata.475827
2026-03-08T23:32:07.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj505 testdata.475827
2026-03-08T23:32:07.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj506 testdata.475827
2026-03-08T23:32:07.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj507 testdata.475827
2026-03-08T23:32:07.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj508 testdata.475827
2026-03-08T23:32:07.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj509 testdata.475827
2026-03-08T23:32:07.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj510 testdata.475827
2026-03-08T23:32:07.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj511 testdata.475827
2026-03-08T23:32:07.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj512 testdata.475827
2026-03-08T23:32:07.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj513 testdata.475827
2026-03-08T23:32:07.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj514 testdata.475827
2026-03-08T23:32:07.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj515 testdata.475827
2026-03-08T23:32:07.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj516 testdata.475827
2026-03-08T23:32:07.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj517 testdata.475827
2026-03-08T23:32:07.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj518 testdata.475827
2026-03-08T23:32:07.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj519 testdata.475827
2026-03-08T23:32:07.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj520 testdata.475827
2026-03-08T23:32:07.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj521 testdata.475827
2026-03-08T23:32:07.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj522 testdata.475827
2026-03-08T23:32:07.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj523 testdata.475827
2026-03-08T23:32:07.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj524 testdata.475827
2026-03-08T23:32:07.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj525 testdata.475827
2026-03-08T23:32:07.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj526 testdata.475827
2026-03-08T23:32:07.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:07.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj527 testdata.475827
2026-03-08T23:32:08.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj528 testdata.475827
2026-03-08T23:32:08.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj529 testdata.475827
2026-03-08T23:32:08.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj530 testdata.475827
2026-03-08T23:32:08.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj531 testdata.475827
2026-03-08T23:32:08.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj532 testdata.475827
2026-03-08T23:32:08.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj533 testdata.475827
2026-03-08T23:32:08.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj534 testdata.475827
2026-03-08T23:32:08.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj535 testdata.475827
2026-03-08T23:32:08.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj536 testdata.475827
2026-03-08T23:32:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj537 testdata.475827
2026-03-08T23:32:08.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj538 testdata.475827
2026-03-08T23:32:08.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj539 testdata.475827
2026-03-08T23:32:08.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj540 testdata.475827
2026-03-08T23:32:08.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj541 testdata.475827
2026-03-08T23:32:08.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj542 testdata.475827
2026-03-08T23:32:08.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj543 testdata.475827
2026-03-08T23:32:08.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj544 testdata.475827
2026-03-08T23:32:08.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj545 testdata.475827
2026-03-08T23:32:08.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj546 testdata.475827
2026-03-08T23:32:08.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj547 testdata.475827
2026-03-08T23:32:08.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj548 testdata.475827
2026-03-08T23:32:08.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj549 testdata.475827
2026-03-08T23:32:08.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj550 testdata.475827
2026-03-08T23:32:08.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj551 testdata.475827
2026-03-08T23:32:08.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj552 testdata.475827
2026-03-08T23:32:08.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj553 testdata.475827
2026-03-08T23:32:08.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj554 testdata.475827
2026-03-08T23:32:08.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj555 testdata.475827
2026-03-08T23:32:08.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj556 testdata.475827
2026-03-08T23:32:08.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj557 testdata.475827
2026-03-08T23:32:08.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj558 testdata.475827
2026-03-08T23:32:08.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:08.683
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj559 testdata.475827 2026-03-08T23:32:08.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj560 testdata.475827 2026-03-08T23:32:08.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj561 testdata.475827 2026-03-08T23:32:08.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj562 testdata.475827 2026-03-08T23:32:08.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj563 testdata.475827 2026-03-08T23:32:08.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.795 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj564 testdata.475827 2026-03-08T23:32:08.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj565 testdata.475827 2026-03-08T23:32:08.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj566 testdata.475827 2026-03-08T23:32:08.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj567 testdata.475827 2026-03-08T23:32:08.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj568 testdata.475827 2026-03-08T23:32:08.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.907 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj569 testdata.475827 2026-03-08T23:32:08.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj570 testdata.475827 2026-03-08T23:32:08.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj571 testdata.475827 2026-03-08T23:32:08.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj572 testdata.475827 2026-03-08T23:32:08.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:08.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj573 testdata.475827 2026-03-08T23:32:09.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.018 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj574 testdata.475827 2026-03-08T23:32:09.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj575 testdata.475827 2026-03-08T23:32:09.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj576 testdata.475827 2026-03-08T23:32:09.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj577 testdata.475827 2026-03-08T23:32:09.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj578 testdata.475827 2026-03-08T23:32:09.124 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.124 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj579 testdata.475827 2026-03-08T23:32:09.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj580 testdata.475827 2026-03-08T23:32:09.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj581 testdata.475827 2026-03-08T23:32:09.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj582 testdata.475827 2026-03-08T23:32:09.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj583 testdata.475827 2026-03-08T23:32:09.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.229 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj584 testdata.475827 2026-03-08T23:32:09.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj585 testdata.475827 2026-03-08T23:32:09.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj586 testdata.475827 2026-03-08T23:32:09.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj587 testdata.475827 2026-03-08T23:32:09.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj588 testdata.475827 2026-03-08T23:32:09.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.334 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj589 testdata.475827 2026-03-08T23:32:09.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj590 testdata.475827 2026-03-08T23:32:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj591 testdata.475827 2026-03-08T23:32:09.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj592 testdata.475827 2026-03-08T23:32:09.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj593 testdata.475827 2026-03-08T23:32:09.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.440 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj594 testdata.475827 2026-03-08T23:32:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj595 testdata.475827 2026-03-08T23:32:09.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj596 testdata.475827 2026-03-08T23:32:09.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj597 testdata.475827 2026-03-08T23:32:09.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj598 testdata.475827 2026-03-08T23:32:09.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.547 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj599 testdata.475827 2026-03-08T23:32:09.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj600 testdata.475827 2026-03-08T23:32:09.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj601 testdata.475827 2026-03-08T23:32:09.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj602 testdata.475827 2026-03-08T23:32:09.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj603 testdata.475827 2026-03-08T23:32:09.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.660 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj604 testdata.475827 2026-03-08T23:32:09.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj605 testdata.475827 2026-03-08T23:32:09.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj606 testdata.475827 2026-03-08T23:32:09.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj607 testdata.475827 2026-03-08T23:32:09.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj608 testdata.475827 2026-03-08T23:32:09.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.778 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj609 testdata.475827 2026-03-08T23:32:09.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj610 testdata.475827 2026-03-08T23:32:09.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj611 testdata.475827 2026-03-08T23:32:09.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj612 testdata.475827 2026-03-08T23:32:09.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj613 testdata.475827 2026-03-08T23:32:09.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.890 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj614 testdata.475827 2026-03-08T23:32:09.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj615 testdata.475827 2026-03-08T23:32:09.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj616 testdata.475827 2026-03-08T23:32:09.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj617 testdata.475827 2026-03-08T23:32:09.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:09.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj618 testdata.475827 2026-03-08T23:32:10.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.001 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj619 testdata.475827 2026-03-08T23:32:10.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj620 testdata.475827 2026-03-08T23:32:10.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj621 testdata.475827 2026-03-08T23:32:10.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj622 testdata.475827 2026-03-08T23:32:10.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj623 testdata.475827 2026-03-08T23:32:10.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.115 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj624 testdata.475827 2026-03-08T23:32:10.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj625 testdata.475827 2026-03-08T23:32:10.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj626 testdata.475827 2026-03-08T23:32:10.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj627 testdata.475827 2026-03-08T23:32:10.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj628 testdata.475827 2026-03-08T23:32:10.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.229 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj629 testdata.475827 2026-03-08T23:32:10.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj630 testdata.475827 2026-03-08T23:32:10.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj631 testdata.475827 2026-03-08T23:32:10.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj632 testdata.475827 2026-03-08T23:32:10.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj633 testdata.475827 2026-03-08T23:32:10.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.337 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj634 testdata.475827 2026-03-08T23:32:10.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj635 testdata.475827 2026-03-08T23:32:10.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj636 testdata.475827 2026-03-08T23:32:10.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj637 testdata.475827 2026-03-08T23:32:10.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj638 testdata.475827 2026-03-08T23:32:10.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.474 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj639 testdata.475827 2026-03-08T23:32:10.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj640 testdata.475827 2026-03-08T23:32:10.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj641 testdata.475827 2026-03-08T23:32:10.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj642 testdata.475827 2026-03-08T23:32:10.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj643 testdata.475827 2026-03-08T23:32:10.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.584 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj644 testdata.475827 2026-03-08T23:32:10.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj645 testdata.475827 2026-03-08T23:32:10.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj646 testdata.475827 2026-03-08T23:32:10.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj647 testdata.475827 2026-03-08T23:32:10.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj648 testdata.475827 2026-03-08T23:32:10.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.697 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj649 testdata.475827 2026-03-08T23:32:10.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj650 testdata.475827 2026-03-08T23:32:10.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj651 testdata.475827 2026-03-08T23:32:10.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj652 testdata.475827 2026-03-08T23:32:10.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj653 testdata.475827 2026-03-08T23:32:10.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.809 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj654 testdata.475827 2026-03-08T23:32:10.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj655 testdata.475827 2026-03-08T23:32:10.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj656 testdata.475827 2026-03-08T23:32:10.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj657 testdata.475827 2026-03-08T23:32:10.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj658 testdata.475827 2026-03-08T23:32:10.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.924 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj659 testdata.475827 2026-03-08T23:32:10.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj660 testdata.475827 2026-03-08T23:32:10.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj661 testdata.475827 2026-03-08T23:32:10.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:10.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj662 testdata.475827 2026-03-08T23:32:11.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj663 testdata.475827 2026-03-08T23:32:11.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.043 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj664 testdata.475827 2026-03-08T23:32:11.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj665 testdata.475827 2026-03-08T23:32:11.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj666 testdata.475827 2026-03-08T23:32:11.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj667 testdata.475827 2026-03-08T23:32:11.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj668 testdata.475827 2026-03-08T23:32:11.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.160 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj669 testdata.475827 2026-03-08T23:32:11.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj670 testdata.475827 2026-03-08T23:32:11.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj671 testdata.475827 2026-03-08T23:32:11.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj672 testdata.475827 2026-03-08T23:32:11.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj673 testdata.475827 2026-03-08T23:32:11.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.271 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj674 testdata.475827 2026-03-08T23:32:11.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj675 testdata.475827 2026-03-08T23:32:11.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj676 testdata.475827 2026-03-08T23:32:11.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj677 testdata.475827 2026-03-08T23:32:11.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj678 testdata.475827 2026-03-08T23:32:11.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.380 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj679 testdata.475827 2026-03-08T23:32:11.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj680 testdata.475827 2026-03-08T23:32:11.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj681 testdata.475827 2026-03-08T23:32:11.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj682 testdata.475827 2026-03-08T23:32:11.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj683 testdata.475827 2026-03-08T23:32:11.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.493 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj684 testdata.475827 2026-03-08T23:32:11.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj685 testdata.475827 2026-03-08T23:32:11.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj686 testdata.475827 2026-03-08T23:32:11.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj687 testdata.475827 2026-03-08T23:32:11.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj688 testdata.475827 2026-03-08T23:32:11.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.601 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj689 testdata.475827 2026-03-08T23:32:11.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj690 testdata.475827 2026-03-08T23:32:11.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj691 testdata.475827 2026-03-08T23:32:11.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj692 testdata.475827 2026-03-08T23:32:11.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj693 testdata.475827 2026-03-08T23:32:11.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.708 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj694 testdata.475827 2026-03-08T23:32:11.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj695 testdata.475827 2026-03-08T23:32:11.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj696 testdata.475827 2026-03-08T23:32:11.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj697 testdata.475827 2026-03-08T23:32:11.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj698 testdata.475827 2026-03-08T23:32:11.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.818 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj699 testdata.475827 2026-03-08T23:32:11.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj700 testdata.475827 2026-03-08T23:32:11.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj701 testdata.475827 2026-03-08T23:32:11.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj702 testdata.475827 2026-03-08T23:32:11.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj703 testdata.475827 2026-03-08T23:32:11.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.927 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj704 testdata.475827 2026-03-08T23:32:11.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj705 testdata.475827 2026-03-08T23:32:11.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj706 testdata.475827 2026-03-08T23:32:11.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:11.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj707 testdata.475827 2026-03-08T23:32:12.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:12.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj708 testdata.475827 2026-03-08T23:32:12.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:12.039 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj709 testdata.475827
[... the same two-line xtrace pair (osd-scrub-test.sh:335: `for i in \`seq 1 $objects\`` / osd-scrub-test.sh:337: `rados -p test put obj<i> testdata.475827`) repeats for obj710 through obj813, timestamps 2026-03-08T23:32:12.061 through 2026-03-08T23:32:14.592 ...]
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj814 testdata.475827 2026-03-08T23:32:14.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj815 testdata.475827 2026-03-08T23:32:14.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj816 testdata.475827 2026-03-08T23:32:14.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj817 testdata.475827 2026-03-08T23:32:14.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj818 testdata.475827 2026-03-08T23:32:14.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.699 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj819 testdata.475827 2026-03-08T23:32:14.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj820 testdata.475827 2026-03-08T23:32:14.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj821 testdata.475827 2026-03-08T23:32:14.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj822 testdata.475827 2026-03-08T23:32:14.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj823 testdata.475827 2026-03-08T23:32:14.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.803 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj824 testdata.475827 2026-03-08T23:32:14.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj825 testdata.475827 2026-03-08T23:32:14.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj826 testdata.475827 2026-03-08T23:32:14.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj827 testdata.475827 2026-03-08T23:32:14.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj828 testdata.475827 2026-03-08T23:32:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.908 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj829 testdata.475827 2026-03-08T23:32:14.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj830 testdata.475827 2026-03-08T23:32:14.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj831 testdata.475827 2026-03-08T23:32:14.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj832 testdata.475827 2026-03-08T23:32:14.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:14.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj833 testdata.475827 2026-03-08T23:32:15.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.012 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj834 testdata.475827 2026-03-08T23:32:15.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj835 testdata.475827 2026-03-08T23:32:15.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj836 testdata.475827 2026-03-08T23:32:15.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj837 testdata.475827 2026-03-08T23:32:15.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj838 testdata.475827 2026-03-08T23:32:15.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.114 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj839 testdata.475827 2026-03-08T23:32:15.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj840 testdata.475827 2026-03-08T23:32:15.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj841 testdata.475827 2026-03-08T23:32:15.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj842 testdata.475827 2026-03-08T23:32:15.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj843 testdata.475827 2026-03-08T23:32:15.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.225 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj844 testdata.475827 2026-03-08T23:32:15.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj845 testdata.475827 2026-03-08T23:32:15.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj846 testdata.475827 2026-03-08T23:32:15.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj847 testdata.475827 2026-03-08T23:32:15.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj848 testdata.475827 2026-03-08T23:32:15.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.331 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj849 testdata.475827 2026-03-08T23:32:15.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj850 testdata.475827 2026-03-08T23:32:15.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.372 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj851 testdata.475827 2026-03-08T23:32:15.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj852 testdata.475827 2026-03-08T23:32:15.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj853 testdata.475827 2026-03-08T23:32:15.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.439 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj854 testdata.475827 2026-03-08T23:32:15.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj855 testdata.475827 2026-03-08T23:32:15.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj856 testdata.475827 2026-03-08T23:32:15.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj857 testdata.475827 2026-03-08T23:32:15.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj858 testdata.475827 2026-03-08T23:32:15.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.547 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj859 testdata.475827 2026-03-08T23:32:15.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj860 testdata.475827 2026-03-08T23:32:15.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj861 testdata.475827 2026-03-08T23:32:15.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj862 testdata.475827 2026-03-08T23:32:15.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj863 testdata.475827 2026-03-08T23:32:15.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.662 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj864 testdata.475827 2026-03-08T23:32:15.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj865 testdata.475827 2026-03-08T23:32:15.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj866 testdata.475827 2026-03-08T23:32:15.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj867 testdata.475827 2026-03-08T23:32:15.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj868 testdata.475827 2026-03-08T23:32:15.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.770 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj869 testdata.475827 2026-03-08T23:32:15.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj870 testdata.475827 2026-03-08T23:32:15.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj871 testdata.475827 2026-03-08T23:32:15.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj872 testdata.475827 2026-03-08T23:32:15.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj873 testdata.475827 2026-03-08T23:32:15.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.865 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj874 testdata.475827 2026-03-08T23:32:15.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj875 testdata.475827 2026-03-08T23:32:15.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj876 testdata.475827 2026-03-08T23:32:15.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj877 testdata.475827 2026-03-08T23:32:15.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj878 testdata.475827 2026-03-08T23:32:15.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:15.979 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj879 testdata.475827
2026-03-08T23:32:16.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj880 testdata.475827
2026-03-08T23:32:16.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj881 testdata.475827
2026-03-08T23:32:16.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj882 testdata.475827
2026-03-08T23:32:16.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj883 testdata.475827
2026-03-08T23:32:16.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj884 testdata.475827
2026-03-08T23:32:16.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj885 testdata.475827
2026-03-08T23:32:16.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj886 testdata.475827
2026-03-08T23:32:16.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj887 testdata.475827
2026-03-08T23:32:16.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj888 testdata.475827
2026-03-08T23:32:16.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj889 testdata.475827
2026-03-08T23:32:16.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj890 testdata.475827
2026-03-08T23:32:16.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj891 testdata.475827
2026-03-08T23:32:16.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj892 testdata.475827
2026-03-08T23:32:16.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj893 testdata.475827
2026-03-08T23:32:16.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj894 testdata.475827
2026-03-08T23:32:16.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj895 testdata.475827
2026-03-08T23:32:16.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj896 testdata.475827
2026-03-08T23:32:16.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj897 testdata.475827
2026-03-08T23:32:16.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj898 testdata.475827
2026-03-08T23:32:16.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj899 testdata.475827
2026-03-08T23:32:16.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj900 testdata.475827
2026-03-08T23:32:16.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj901 testdata.475827
2026-03-08T23:32:16.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj902 testdata.475827
2026-03-08T23:32:16.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj903 testdata.475827
2026-03-08T23:32:16.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj904 testdata.475827
2026-03-08T23:32:16.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj905 testdata.475827
2026-03-08T23:32:16.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj906 testdata.475827
2026-03-08T23:32:16.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj907 testdata.475827
2026-03-08T23:32:16.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj908 testdata.475827
2026-03-08T23:32:16.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj909 testdata.475827
2026-03-08T23:32:16.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj910 testdata.475827
2026-03-08T23:32:16.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj911 testdata.475827
2026-03-08T23:32:16.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj912 testdata.475827
2026-03-08T23:32:16.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj913 testdata.475827
2026-03-08T23:32:16.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj914 testdata.475827
2026-03-08T23:32:16.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj915 testdata.475827
2026-03-08T23:32:16.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj916 testdata.475827
2026-03-08T23:32:16.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj917 testdata.475827
2026-03-08T23:32:16.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj918 testdata.475827
2026-03-08T23:32:16.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj919 testdata.475827
2026-03-08T23:32:16.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj920 testdata.475827
2026-03-08T23:32:16.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj921 testdata.475827
2026-03-08T23:32:16.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj922 testdata.475827
2026-03-08T23:32:16.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj923 testdata.475827
2026-03-08T23:32:16.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:16.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj924 testdata.475827
2026-03-08T23:32:17.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj925 testdata.475827
2026-03-08T23:32:17.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj926 testdata.475827
2026-03-08T23:32:17.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj927 testdata.475827
2026-03-08T23:32:17.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj928 testdata.475827
2026-03-08T23:32:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj929 testdata.475827
2026-03-08T23:32:17.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj930 testdata.475827
2026-03-08T23:32:17.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj931 testdata.475827
2026-03-08T23:32:17.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj932 testdata.475827
2026-03-08T23:32:17.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj933 testdata.475827
2026-03-08T23:32:17.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj934 testdata.475827
2026-03-08T23:32:17.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj935 testdata.475827
2026-03-08T23:32:17.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj936 testdata.475827
2026-03-08T23:32:17.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj937 testdata.475827
2026-03-08T23:32:17.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj938 testdata.475827
2026-03-08T23:32:17.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj939 testdata.475827
2026-03-08T23:32:17.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj940 testdata.475827
2026-03-08T23:32:17.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj941 testdata.475827
2026-03-08T23:32:17.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj942 testdata.475827
2026-03-08T23:32:17.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj943 testdata.475827
2026-03-08T23:32:17.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj944 testdata.475827
2026-03-08T23:32:17.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj945 testdata.475827
2026-03-08T23:32:17.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj946 testdata.475827
2026-03-08T23:32:17.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj947 testdata.475827
2026-03-08T23:32:17.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj948 testdata.475827
2026-03-08T23:32:17.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj949 testdata.475827
2026-03-08T23:32:17.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj950 testdata.475827
2026-03-08T23:32:17.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj951 testdata.475827
2026-03-08T23:32:17.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.695 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj952 testdata.475827
2026-03-08T23:32:17.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj953 testdata.475827
2026-03-08T23:32:17.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj954 testdata.475827
2026-03-08T23:32:17.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj955 testdata.475827
2026-03-08T23:32:17.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj956 testdata.475827
2026-03-08T23:32:17.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj957 testdata.475827
2026-03-08T23:32:17.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj958 testdata.475827
2026-03-08T23:32:17.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj959 testdata.475827
2026-03-08T23:32:17.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj960 testdata.475827
2026-03-08T23:32:17.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj961 testdata.475827
2026-03-08T23:32:17.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj962 testdata.475827
2026-03-08T23:32:17.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj963 testdata.475827
2026-03-08T23:32:17.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj964 testdata.475827
2026-03-08T23:32:17.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:17.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj965 testdata.475827
2026-03-08T23:32:18.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj966 testdata.475827
2026-03-08T23:32:18.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj967 testdata.475827
2026-03-08T23:32:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj968 testdata.475827
2026-03-08T23:32:18.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj969 testdata.475827
2026-03-08T23:32:18.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj970 testdata.475827
2026-03-08T23:32:18.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj971 testdata.475827
2026-03-08T23:32:18.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj972 testdata.475827
2026-03-08T23:32:18.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj973 testdata.475827
2026-03-08T23:32:18.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.205 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj974 testdata.475827
2026-03-08T23:32:18.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj975 testdata.475827
2026-03-08T23:32:18.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj976 testdata.475827
2026-03-08T23:32:18.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj977 testdata.475827
2026-03-08T23:32:18.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj978 testdata.475827
2026-03-08T23:32:18.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj979 testdata.475827
2026-03-08T23:32:18.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj980 testdata.475827
2026-03-08T23:32:18.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj981 testdata.475827
2026-03-08T23:32:18.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj982 testdata.475827
2026-03-08T23:32:18.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj983 testdata.475827
2026-03-08T23:32:18.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:32:18.553 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj984 testdata.475827 2026-03-08T23:32:18.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj985 testdata.475827 2026-03-08T23:32:18.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj986 testdata.475827 2026-03-08T23:32:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj987 testdata.475827 2026-03-08T23:32:18.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj988 testdata.475827 2026-03-08T23:32:18.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.662 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj989 testdata.475827 2026-03-08T23:32:18.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj990 testdata.475827 2026-03-08T23:32:18.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj991 testdata.475827 2026-03-08T23:32:18.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj992 testdata.475827 2026-03-08T23:32:18.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj993 testdata.475827 2026-03-08T23:32:18.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.793 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj994 testdata.475827 2026-03-08T23:32:18.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj995 testdata.475827 2026-03-08T23:32:18.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj996 testdata.475827 2026-03-08T23:32:18.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj997 testdata.475827 2026-03-08T23:32:18.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj998 testdata.475827 2026-03-08T23:32:18.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.901 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj999 testdata.475827 2026-03-08T23:32:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:32:18.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj1000 testdata.475827 2026-03-08T23:32:18.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:339: _scrub_abort: rm -f testdata.475827 2026-03-08T23:32:18.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:341: _scrub_abort: get_primary test obj1 2026-03-08T23:32:18.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:32:18.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:32:18.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:32:18.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:32:19.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:341: _scrub_abort: local primary=1 2026-03-08T23:32:19.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:342: _scrub_abort: local pgid=1.0 
2026-03-08T23:32:19.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:344: _scrub_abort: ceph tell 1.0 schedule-deep-scrub 2026-03-08T23:32:19.192 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:32:19.193 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T23:32:19.193 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:32:19.193 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-22T23:30:39.201173+0000" 2026-03-08T23:32:19.193 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:32:19.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:347: _scrub_abort: set -o pipefail 2026-03-08T23:32:19.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:348: _scrub_abort: found=no 2026-03-08T23:32:19.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:349: _scrub_abort: seq 0 200 2026-03-08T23:32:19.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:349: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:19.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:351: _scrub_abort: flush_pg_stats 2026-03-08T23:32:19.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:19.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:19.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:19.374 
INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:19.374 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:19.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:19.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:19.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:19.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836523 2026-03-08T23:32:19.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836523 2026-03-08T23:32:19.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523' 2026-03-08T23:32:19.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:19.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:19.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672997 2026-03-08T23:32:19.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672997 2026-03-08T23:32:19.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836523 1-42949672997' 2026-03-08T23:32:19.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:19.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:19.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542174 2026-03-08T23:32:19.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542174 2026-03-08T23:32:19.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836523 1-42949672997 2-60129542174' 2026-03-08T23:32:19.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:19.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836523 2026-03-08T23:32:19.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:19.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:19.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836523 2026-03-08T23:32:19.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:19.611 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836523 2026-03-08T23:32:19.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836523' 2026-03-08T23:32:19.611 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836523 2026-03-08T23:32:19.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:19.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836523 -lt 21474836523 2026-03-08T23:32:19.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:19.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672997 2026-03-08T23:32:19.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:19.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:19.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:19.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672997 2026-03-08T23:32:19.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672997 
2026-03-08T23:32:19.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672997' 2026-03-08T23:32:19.792 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672997 2026-03-08T23:32:19.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:19.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672997 -lt 42949672997 2026-03-08T23:32:19.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:19.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542174 2026-03-08T23:32:19.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:19.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:19.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542174 2026-03-08T23:32:19.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:19.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542174 2026-03-08T23:32:19.961 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542174 2026-03-08T23:32:19.961 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542174' 2026-03-08T23:32:19.961 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:20.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542173 -lt 60129542174 2026-03-08T23:32:20.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:21.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:21.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:21.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542173 -lt 60129542174 2026-03-08T23:32:21.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:22.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:32:22.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:22.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542176 -lt 60129542174 2026-03-08T23:32:22.477 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:352: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:22.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:352: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:22.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:352: _scrub_abort: grep '^1.0' 2026-03-08T23:32:22.629 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:22.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:354: _scrub_abort: found=yes 2026-03-08T23:32:22.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:356: _scrub_abort: break 2026-03-08T23:32:22.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:359: _scrub_abort: set +o pipefail 2026-03-08T23:32:22.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:361: _scrub_abort: test yes = no 2026-03-08T23:32:22.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:367: _scrub_abort: ceph osd set nodeep-scrub 2026-03-08T23:32:22.848 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:32:22.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:368: _scrub_abort: '[' deep-scrub = deep-scrub ']' 2026-03-08T23:32:22.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:370: _scrub_abort: ceph osd set noscrub 2026-03-08T23:32:23.054 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 
2026-03-08T23:32:23.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:374: _scrub_abort: set -o pipefail 2026-03-08T23:32:23.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: seq 0 200 2026-03-08T23:32:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:23.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:23.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:23.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:23.244 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:23.244 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:23.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:23.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:23.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:23.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: 
flush_pg_stats: seq=21474836527 2026-03-08T23:32:23.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836527 2026-03-08T23:32:23.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527' 2026-03-08T23:32:23.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:23.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:23.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673002 2026-03-08T23:32:23.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673002 2026-03-08T23:32:23.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673002' 2026-03-08T23:32:23.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:23.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:23.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542179 2026-03-08T23:32:23.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542179 2026-03-08T23:32:23.486 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836527 1-42949673002 2-60129542179' 2026-03-08T23:32:23.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:23.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836527 2026-03-08T23:32:23.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:23.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:23.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836527 2026-03-08T23:32:23.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:23.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836527 2026-03-08T23:32:23.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836527' 2026-03-08T23:32:23.489 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836527 2026-03-08T23:32:23.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:23.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
21474836528 -lt 21474836527 2026-03-08T23:32:23.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:23.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673002 2026-03-08T23:32:23.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:23.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:23.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673002 2026-03-08T23:32:23.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:23.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673002 2026-03-08T23:32:23.660 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673002 2026-03-08T23:32:23.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673002' 2026-03-08T23:32:23.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:23.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673002 -lt 42949673002 2026-03-08T23:32:23.826 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:23.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542179 2026-03-08T23:32:23.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:23.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:23.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542179 2026-03-08T23:32:23.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:23.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542179 2026-03-08T23:32:23.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542179' 2026-03-08T23:32:23.829 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542179 2026-03-08T23:32:23.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:24.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542179 -lt 60129542179 2026-03-08T23:32:24.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 
2026-03-08T23:32:24.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:24.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:24.152 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:24.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:24.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:24.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:24.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:24.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:24.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:24.335 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:24.335 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:24.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:24.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:24.335 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:24.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836530 2026-03-08T23:32:24.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836530 2026-03-08T23:32:24.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530' 2026-03-08T23:32:24.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:24.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:24.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673004 2026-03-08T23:32:24.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673004 2026-03-08T23:32:24.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673004' 2026-03-08T23:32:24.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:24.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:24.573 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542181 2026-03-08T23:32:24.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542181 2026-03-08T23:32:24.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836530 1-42949673004 2-60129542181' 2026-03-08T23:32:24.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:24.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836530 2026-03-08T23:32:24.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:24.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:24.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836530 2026-03-08T23:32:24.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:24.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836530 2026-03-08T23:32:24.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836530' 2026-03-08T23:32:24.576 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836530 
2026-03-08T23:32:24.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:24.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836528 -lt 21474836530 2026-03-08T23:32:24.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:25.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:25.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:25.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836531 -lt 21474836530 2026-03-08T23:32:25.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:25.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673004 2026-03-08T23:32:25.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:25.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:25.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673004 2026-03-08T23:32:25.922 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:25.923 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673004 2026-03-08T23:32:25.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673004 2026-03-08T23:32:25.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673004' 2026-03-08T23:32:25.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:26.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673005 -lt 42949673004 2026-03-08T23:32:26.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:26.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542181 2026-03-08T23:32:26.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:26.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:26.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542181 2026-03-08T23:32:26.097 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:32:26.098 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542181 2026-03-08T23:32:26.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542181 2026-03-08T23:32:26.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542181' 2026-03-08T23:32:26.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:26.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542182 -lt 60129542181 2026-03-08T23:32:26.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:26.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:26.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:26.437 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:26.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:26.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:26.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:26.450 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:26.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:26.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:26.617 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:26.617 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:26.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:26.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:26.617 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:26.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836533 2026-03-08T23:32:26.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836533 2026-03-08T23:32:26.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836533' 2026-03-08T23:32:26.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:26.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:32:26.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673007 2026-03-08T23:32:26.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673007 2026-03-08T23:32:26.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836533 1-42949673007' 2026-03-08T23:32:26.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:26.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:26.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542185 2026-03-08T23:32:26.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542185 2026-03-08T23:32:26.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836533 1-42949673007 2-60129542185' 2026-03-08T23:32:26.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:26.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836533 2026-03-08T23:32:26.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:26.872 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:26.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836533 2026-03-08T23:32:26.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:26.874 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836533 2026-03-08T23:32:26.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836533 2026-03-08T23:32:26.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836533' 2026-03-08T23:32:26.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:27.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836531 -lt 21474836533 2026-03-08T23:32:27.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:28.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:28.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:28.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836534 -lt 
21474836533 2026-03-08T23:32:28.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:28.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673007 2026-03-08T23:32:28.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:28.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:28.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673007 2026-03-08T23:32:28.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:28.220 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673007 2026-03-08T23:32:28.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673007 2026-03-08T23:32:28.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673007' 2026-03-08T23:32:28.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:28.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673008 -lt 42949673007 2026-03-08T23:32:28.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:32:28.390 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542185 2026-03-08T23:32:28.390 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:28.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:28.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542185 2026-03-08T23:32:28.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:28.392 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542185 2026-03-08T23:32:28.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542185 2026-03-08T23:32:28.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542185' 2026-03-08T23:32:28.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:28.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542185 -lt 60129542185 2026-03-08T23:32:28.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:28.567 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:28.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:28.726 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:28.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:28.742 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:28.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:28.920 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:28.920 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:28.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:28.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:28.920 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:29.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836536 2026-03-08T23:32:29.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836536 2026-03-08T23:32:29.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836536' 2026-03-08T23:32:29.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:29.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:29.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673010 2026-03-08T23:32:29.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673010 2026-03-08T23:32:29.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836536 1-42949673010' 2026-03-08T23:32:29.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:29.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:29.166 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542188 2026-03-08T23:32:29.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542188 2026-03-08T23:32:29.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836536 1-42949673010 2-60129542188' 2026-03-08T23:32:29.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:29.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836536 2026-03-08T23:32:29.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:29.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:29.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836536 2026-03-08T23:32:29.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:29.169 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836536 2026-03-08T23:32:29.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836536 2026-03-08T23:32:29.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836536' 
2026-03-08T23:32:29.170 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:29.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836534 -lt 21474836536 2026-03-08T23:32:29.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:30.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:30.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:30.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836537 -lt 21474836536 2026-03-08T23:32:30.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:30.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673010 2026-03-08T23:32:30.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:30.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:30.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673010 2026-03-08T23:32:30.515 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:30.516 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673010 2026-03-08T23:32:30.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673010 2026-03-08T23:32:30.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673010' 2026-03-08T23:32:30.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:30.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673011 -lt 42949673010 2026-03-08T23:32:30.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:30.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542188 2026-03-08T23:32:30.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:30.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:30.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542188 2026-03-08T23:32:30.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:32:30.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542188 2026-03-08T23:32:30.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542188' 2026-03-08T23:32:30.691 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542188 2026-03-08T23:32:30.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:30.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542188 -lt 60129542188 2026-03-08T23:32:30.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:30.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:30.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:31.024 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:31.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:31.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:31.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:31.038 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:31.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:31.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:31.204 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:31.204 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:31.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:31.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:31.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:31.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836539 2026-03-08T23:32:31.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836539 2026-03-08T23:32:31.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836539' 2026-03-08T23:32:31.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:31.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:32:31.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673014 2026-03-08T23:32:31.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673014 2026-03-08T23:32:31.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836539 1-42949673014' 2026-03-08T23:32:31.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:31.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:31.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542191 2026-03-08T23:32:31.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542191 2026-03-08T23:32:31.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836539 1-42949673014 2-60129542191' 2026-03-08T23:32:31.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:31.454 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836539 2026-03-08T23:32:31.454 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:31.455 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:31.456 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836539 2026-03-08T23:32:31.456 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:31.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836539 2026-03-08T23:32:31.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836539' 2026-03-08T23:32:31.457 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836539 2026-03-08T23:32:31.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:31.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836537 -lt 21474836539 2026-03-08T23:32:31.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:32.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:32.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:32.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836540 -lt 
21474836539 2026-03-08T23:32:32.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:32.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673014 2026-03-08T23:32:32.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:32.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:32.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673014 2026-03-08T23:32:32.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:32.802 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673014 2026-03-08T23:32:32.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673014 2026-03-08T23:32:32.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673014' 2026-03-08T23:32:32.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:32.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673014 -lt 42949673014 2026-03-08T23:32:32.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:32:32.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542191 2026-03-08T23:32:32.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:32.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:32.973 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542191 2026-03-08T23:32:32.973 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:32.974 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542191 2026-03-08T23:32:32.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542191 2026-03-08T23:32:32.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542191' 2026-03-08T23:32:32.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:33.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542191 -lt 60129542191 2026-03-08T23:32:33.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:33.146 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:33.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:33.296 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:33.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:33.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:33.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:33.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:33.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:33.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:33.474 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:33.474 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:33.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:33.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:33.474 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:33.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836543 2026-03-08T23:32:33.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836543 2026-03-08T23:32:33.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836543' 2026-03-08T23:32:33.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:33.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:33.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673017 2026-03-08T23:32:33.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673017 2026-03-08T23:32:33.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836543 1-42949673017' 2026-03-08T23:32:33.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:33.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:33.719 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542194 2026-03-08T23:32:33.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542194 2026-03-08T23:32:33.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836543 1-42949673017 2-60129542194' 2026-03-08T23:32:33.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:33.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836543 2026-03-08T23:32:33.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:33.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:33.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836543 2026-03-08T23:32:33.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:33.722 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836543 2026-03-08T23:32:33.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836543 2026-03-08T23:32:33.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836543' 
2026-03-08T23:32:33.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:33.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836543 -lt 21474836543 2026-03-08T23:32:33.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:33.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673017 2026-03-08T23:32:33.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:33.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:33.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673017 2026-03-08T23:32:33.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:33.899 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673017 2026-03-08T23:32:33.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673017 2026-03-08T23:32:33.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673017' 2026-03-08T23:32:33.899 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T23:32:34.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673016 -lt 42949673017 2026-03-08T23:32:34.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:35.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:35.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673016 -lt 42949673017 2026-03-08T23:32:35.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:36.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:32:36.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:36.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673019 -lt 42949673017 2026-03-08T23:32:36.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:36.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542194 2026-03-08T23:32:36.412 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:36.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:36.413 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542194 2026-03-08T23:32:36.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:36.415 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542194 2026-03-08T23:32:36.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542194 2026-03-08T23:32:36.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542194' 2026-03-08T23:32:36.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:36.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542196 -lt 60129542194 2026-03-08T23:32:36.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:36.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:36.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 
2026-03-08T23:32:36.731 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:36.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:36.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:36.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:36.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:36.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:36.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:36.914 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:36.914 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:36.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:36.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:36.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:36.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836547 2026-03-08T23:32:36.991 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836547 2026-03-08T23:32:36.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836547' 2026-03-08T23:32:36.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:36.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:37.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673021 2026-03-08T23:32:37.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673021 2026-03-08T23:32:37.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836547 1-42949673021' 2026-03-08T23:32:37.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:37.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:37.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542199 2026-03-08T23:32:37.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542199 2026-03-08T23:32:37.144 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836547 1-42949673021 2-60129542199' 2026-03-08T23:32:37.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:37.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836547 2026-03-08T23:32:37.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:37.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:37.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836547 2026-03-08T23:32:37.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:37.147 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836547 2026-03-08T23:32:37.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836547 2026-03-08T23:32:37.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836547' 2026-03-08T23:32:37.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:37.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
21474836545 -lt 21474836547 2026-03-08T23:32:37.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:38.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:38.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:38.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836548 -lt 21474836547 2026-03-08T23:32:38.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:38.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673021 2026-03-08T23:32:38.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:38.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:38.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673021 2026-03-08T23:32:38.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:38.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673021 2026-03-08T23:32:38.485 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 
42949673021 2026-03-08T23:32:38.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673021' 2026-03-08T23:32:38.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:38.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673022 -lt 42949673021 2026-03-08T23:32:38.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:38.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542199 2026-03-08T23:32:38.659 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:38.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:38.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542199 2026-03-08T23:32:38.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:38.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542199 2026-03-08T23:32:38.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542199' 2026-03-08T23:32:38.661 
INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542199 2026-03-08T23:32:38.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:38.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542199 -lt 60129542199 2026-03-08T23:32:38.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:38.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:38.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:38.975 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:38.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:38.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:38.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:38.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:38.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:39.152 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:39.152 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:39.152 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:39.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:39.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:39.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:39.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836550 2026-03-08T23:32:39.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836550 2026-03-08T23:32:39.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836550' 2026-03-08T23:32:39.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:39.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:39.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673025 2026-03-08T23:32:39.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
42949673025 2026-03-08T23:32:39.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836550 1-42949673025' 2026-03-08T23:32:39.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:39.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:39.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542202 2026-03-08T23:32:39.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542202 2026-03-08T23:32:39.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836550 1-42949673025 2-60129542202' 2026-03-08T23:32:39.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:39.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836550 2026-03-08T23:32:39.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:39.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:39.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836550 2026-03-08T23:32:39.381 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:39.382 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836550 2026-03-08T23:32:39.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836550 2026-03-08T23:32:39.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836550' 2026-03-08T23:32:39.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:39.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836548 -lt 21474836550 2026-03-08T23:32:39.548 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:40.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:40.549 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:40.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836551 -lt 21474836550 2026-03-08T23:32:40.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:40.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949673025 2026-03-08T23:32:40.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:40.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:40.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673025 2026-03-08T23:32:40.723 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:40.725 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673025 2026-03-08T23:32:40.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673025 2026-03-08T23:32:40.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673025' 2026-03-08T23:32:40.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:40.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673025 -lt 42949673025 2026-03-08T23:32:40.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:40.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542202 2026-03-08T23:32:40.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:40.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:40.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542202 2026-03-08T23:32:40.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:40.891 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542202 2026-03-08T23:32:40.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542202 2026-03-08T23:32:40.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542202' 2026-03-08T23:32:40.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:41.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542202 -lt 60129542202 2026-03-08T23:32:41.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:41.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:41.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:41.205 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:41.218 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:41.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:41.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:41.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:41.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:41.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:41.377 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:41.377 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:41.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:41.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:41.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:41.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836554 2026-03-08T23:32:41.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836554 
2026-03-08T23:32:41.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836554' 2026-03-08T23:32:41.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:41.456 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:41.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673028 2026-03-08T23:32:41.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673028 2026-03-08T23:32:41.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836554 1-42949673028' 2026-03-08T23:32:41.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:41.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:41.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542205 2026-03-08T23:32:41.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542205 2026-03-08T23:32:41.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836554 1-42949673028 2-60129542205' 2026-03-08T23:32:41.608 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:41.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836554 2026-03-08T23:32:41.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:41.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:41.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836554 2026-03-08T23:32:41.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:41.611 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836554 2026-03-08T23:32:41.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836554 2026-03-08T23:32:41.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836554' 2026-03-08T23:32:41.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:41.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836554 -lt 21474836554 2026-03-08T23:32:41.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:32:41.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673028 2026-03-08T23:32:41.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:41.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:41.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673028 2026-03-08T23:32:41.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:41.774 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673028 2026-03-08T23:32:41.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673028 2026-03-08T23:32:41.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673028' 2026-03-08T23:32:41.774 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:41.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673028 -lt 42949673028 2026-03-08T23:32:41.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:41.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
2-60129542205 2026-03-08T23:32:41.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:41.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:41.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542205 2026-03-08T23:32:41.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:41.939 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542205 2026-03-08T23:32:41.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542205 2026-03-08T23:32:41.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542205' 2026-03-08T23:32:41.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:42.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542204 -lt 60129542205 2026-03-08T23:32:42.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:43.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:43.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 2 2026-03-08T23:32:43.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542204 -lt 60129542205 2026-03-08T23:32:43.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:44.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:32:44.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:44.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542207 -lt 60129542205 2026-03-08T23:32:44.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:44.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:44.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:44.578 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:44.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:44.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:44.590 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:44.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:44.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:44.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:44.753 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:44.753 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:44.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:44.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:44.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:44.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836558 2026-03-08T23:32:44.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836558 2026-03-08T23:32:44.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836558' 2026-03-08T23:32:44.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:32:44.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:44.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673032 2026-03-08T23:32:44.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673032 2026-03-08T23:32:44.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836558 1-42949673032' 2026-03-08T23:32:44.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:44.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:44.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542210 2026-03-08T23:32:44.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542210 2026-03-08T23:32:44.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836558 1-42949673032 2-60129542210' 2026-03-08T23:32:44.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:44.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836558 2026-03-08T23:32:44.990 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:44.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:44.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836558 2026-03-08T23:32:44.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:44.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836558 2026-03-08T23:32:44.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836558' 2026-03-08T23:32:44.993 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836558 2026-03-08T23:32:44.993 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:45.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836556 -lt 21474836558 2026-03-08T23:32:45.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:46.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:46.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:32:46.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836559 -lt 21474836558
2026-03-08T23:32:46.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:46.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673032
2026-03-08T23:32:46.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:46.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:32:46.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673032
2026-03-08T23:32:46.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:46.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673032
2026-03-08T23:32:46.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673032'
2026-03-08T23:32:46.331 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673032
2026-03-08T23:32:46.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:32:46.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673033 -lt 42949673032
2026-03-08T23:32:46.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:46.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542210
2026-03-08T23:32:46.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:46.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:32:46.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542210
2026-03-08T23:32:46.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:46.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542210
2026-03-08T23:32:46.493 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542210
2026-03-08T23:32:46.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542210'
2026-03-08T23:32:46.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:32:46.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542210 -lt 60129542210
2026-03-08T23:32:46.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:32:46.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:32:46.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:32:46.804 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:32:46.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:32:46.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:32:46.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:32:46.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:32:46.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:32:46.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:32:46.981 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:32:46.981 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:32:46.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:32:46.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:46.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:32:47.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836561
2026-03-08T23:32:47.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836561
2026-03-08T23:32:47.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836561'
2026-03-08T23:32:47.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:47.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:32:47.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673035
2026-03-08T23:32:47.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673035
2026-03-08T23:32:47.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836561 1-42949673035'
2026-03-08T23:32:47.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:47.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:32:47.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542213
2026-03-08T23:32:47.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542213
2026-03-08T23:32:47.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836561 1-42949673035 2-60129542213'
2026-03-08T23:32:47.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:47.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836561
2026-03-08T23:32:47.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:47.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:32:47.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836561
2026-03-08T23:32:47.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:47.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836561
2026-03-08T23:32:47.225 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836561
2026-03-08T23:32:47.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836561'
2026-03-08T23:32:47.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:32:47.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836559 -lt 21474836561
2026-03-08T23:32:47.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:32:48.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:32:48.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:32:48.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836562 -lt 21474836561
2026-03-08T23:32:48.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:48.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673035
2026-03-08T23:32:48.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:48.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:32:48.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673035
2026-03-08T23:32:48.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:48.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673035
2026-03-08T23:32:48.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673035'
2026-03-08T23:32:48.568 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673035
2026-03-08T23:32:48.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:32:48.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673036 -lt 42949673035
2026-03-08T23:32:48.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:48.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542213
2026-03-08T23:32:48.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:48.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:32:48.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542213
2026-03-08T23:32:48.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:48.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542213
2026-03-08T23:32:48.739 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542213
2026-03-08T23:32:48.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542213'
2026-03-08T23:32:48.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:32:48.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542213 -lt 60129542213
2026-03-08T23:32:48.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:32:48.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:32:48.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:32:49.075 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:32:49.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:32:49.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:32:49.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:32:49.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:32:49.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:32:49.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:32:49.264 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:32:49.264 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:32:49.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:32:49.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:49.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:32:49.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836564
2026-03-08T23:32:49.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836564
2026-03-08T23:32:49.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836564'
2026-03-08T23:32:49.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:49.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:32:49.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673039
2026-03-08T23:32:49.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673039
2026-03-08T23:32:49.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836564 1-42949673039'
2026-03-08T23:32:49.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:49.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:32:49.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542216
2026-03-08T23:32:49.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542216
2026-03-08T23:32:49.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836564 1-42949673039 2-60129542216'
2026-03-08T23:32:49.497 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:49.497 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836564
2026-03-08T23:32:49.497 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:49.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:32:49.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836564
2026-03-08T23:32:49.499 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:49.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836564
2026-03-08T23:32:49.499 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836564
2026-03-08T23:32:49.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836564'
2026-03-08T23:32:49.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:32:49.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836565 -lt 21474836564
2026-03-08T23:32:49.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:49.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673039
2026-03-08T23:32:49.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:49.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:32:49.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673039
2026-03-08T23:32:49.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:49.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673039
2026-03-08T23:32:49.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673039'
2026-03-08T23:32:49.669 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673039
2026-03-08T23:32:49.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:32:49.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673039 -lt 42949673039
2026-03-08T23:32:49.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:49.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542216
2026-03-08T23:32:49.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:49.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:32:49.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542216
2026-03-08T23:32:49.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:49.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542216
2026-03-08T23:32:49.843 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542216
2026-03-08T23:32:49.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542216'
2026-03-08T23:32:49.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:32:50.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542216 -lt 60129542216
2026-03-08T23:32:50.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:32:50.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:32:50.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:32:50.155 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:32:50.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:32:50.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:32:50.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:32:50.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:32:50.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:32:50.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:32:50.346 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:32:50.346 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:32:50.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:32:50.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:50.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:32:50.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836567
2026-03-08T23:32:50.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836567
2026-03-08T23:32:50.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836567'
2026-03-08T23:32:50.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:50.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:32:50.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673041
2026-03-08T23:32:50.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673041
2026-03-08T23:32:50.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836567 1-42949673041'
2026-03-08T23:32:50.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:50.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:32:50.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542218
2026-03-08T23:32:50.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542218
2026-03-08T23:32:50.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836567 1-42949673041 2-60129542218'
2026-03-08T23:32:50.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:50.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836567
2026-03-08T23:32:50.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:50.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:32:50.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836567
2026-03-08T23:32:50.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:50.596 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836567
2026-03-08T23:32:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836567
2026-03-08T23:32:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836567'
2026-03-08T23:32:50.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:32:50.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836565 -lt 21474836567
2026-03-08T23:32:50.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:32:51.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:32:51.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:32:51.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836568 -lt 21474836567
2026-03-08T23:32:51.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:51.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673041
2026-03-08T23:32:51.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:51.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:32:51.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673041
2026-03-08T23:32:51.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:51.943 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673041
2026-03-08T23:32:51.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673041
2026-03-08T23:32:51.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673041'
2026-03-08T23:32:51.943 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:32:52.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673042 -lt 42949673041
2026-03-08T23:32:52.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:52.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542218
2026-03-08T23:32:52.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:52.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:32:52.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542218
2026-03-08T23:32:52.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:52.120 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542218
2026-03-08T23:32:52.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542218
2026-03-08T23:32:52.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542218'
2026-03-08T23:32:52.120 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:32:52.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542219 -lt 60129542218
2026-03-08T23:32:52.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:32:52.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:32:52.298 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:32:52.463 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:32:52.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:32:52.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:32:52.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:32:52.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:32:52.476 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:52.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:32:52.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836570
2026-03-08T23:32:52.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836570
2026-03-08T23:32:52.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836570'
2026-03-08T23:32:52.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:52.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:32:52.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673044
2026-03-08T23:32:52.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673044
2026-03-08T23:32:52.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836570 1-42949673044'
2026-03-08T23:32:52.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:32:52.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:32:52.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542222
2026-03-08T23:32:52.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542222
2026-03-08T23:32:52.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836570 1-42949673044 2-60129542222'
2026-03-08T23:32:52.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:32:52.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836570
2026-03-08T23:32:52.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:32:52.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:32:52.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836570
2026-03-08T23:32:52.893 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:32:52.894 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836570
2026-03-08T23:32:52.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836570 2026-03-08T23:32:52.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836570' 2026-03-08T23:32:52.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:53.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836568 -lt 21474836570 2026-03-08T23:32:53.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:54.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:54.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:54.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836571 -lt 21474836570 2026-03-08T23:32:54.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:54.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673044 2026-03-08T23:32:54.241 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:54.242 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:54.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673044 2026-03-08T23:32:54.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:54.243 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673044 2026-03-08T23:32:54.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673044 2026-03-08T23:32:54.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673044' 2026-03-08T23:32:54.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:54.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673045 -lt 42949673044 2026-03-08T23:32:54.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:54.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542222 2026-03-08T23:32:54.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:54.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:54.411 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542222 2026-03-08T23:32:54.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:54.412 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542222 2026-03-08T23:32:54.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542222 2026-03-08T23:32:54.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542222' 2026-03-08T23:32:54.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:54.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542222 -lt 60129542222 2026-03-08T23:32:54.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:54.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:54.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:54.778 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:54.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:54.792 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:54.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:54.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:54.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:54.957 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:54.957 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:54.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:55.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836573 2026-03-08T23:32:55.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836573 2026-03-08T23:32:55.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836573' 2026-03-08T23:32:55.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:55.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:55.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673047 2026-03-08T23:32:55.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673047 2026-03-08T23:32:55.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836573 1-42949673047' 2026-03-08T23:32:55.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:55.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:55.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542225 2026-03-08T23:32:55.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542225 2026-03-08T23:32:55.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836573 1-42949673047 2-60129542225' 2026-03-08T23:32:55.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:32:55.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836573 2026-03-08T23:32:55.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:55.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:55.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836573 2026-03-08T23:32:55.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:55.200 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836573 2026-03-08T23:32:55.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836573 2026-03-08T23:32:55.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836573' 2026-03-08T23:32:55.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:55.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836571 -lt 21474836573 2026-03-08T23:32:55.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:56.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 
']' 2026-03-08T23:32:56.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:56.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836574 -lt 21474836573 2026-03-08T23:32:56.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:56.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673047 2026-03-08T23:32:56.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:56.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:56.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673047 2026-03-08T23:32:56.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:56.546 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673047 2026-03-08T23:32:56.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673047 2026-03-08T23:32:56.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673047' 2026-03-08T23:32:56.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:32:56.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673048 -lt 42949673047 2026-03-08T23:32:56.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:56.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542225 2026-03-08T23:32:56.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:56.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:56.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542225 2026-03-08T23:32:56.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:56.725 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542225 2026-03-08T23:32:56.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542225 2026-03-08T23:32:56.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542225' 2026-03-08T23:32:56.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:56.898 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542225 -lt 60129542225 2026-03-08T23:32:56.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:56.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:56.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:57.054 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:57.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:57.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:57.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:57.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:57.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:57.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:57.239 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:57.239 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:57.239 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:57.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:57.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:57.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836576 2026-03-08T23:32:57.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836576 2026-03-08T23:32:57.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836576' 2026-03-08T23:32:57.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:57.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:57.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673051 2026-03-08T23:32:57.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673051 2026-03-08T23:32:57.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836576 1-42949673051' 2026-03-08T23:32:57.402 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:57.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:57.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542228 2026-03-08T23:32:57.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542228 2026-03-08T23:32:57.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836576 1-42949673051 2-60129542228' 2026-03-08T23:32:57.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:57.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836576 2026-03-08T23:32:57.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:57.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:57.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836576 2026-03-08T23:32:57.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:57.482 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836576 
2026-03-08T23:32:57.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836576 2026-03-08T23:32:57.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836576' 2026-03-08T23:32:57.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:57.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836574 -lt 21474836576 2026-03-08T23:32:57.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:32:58.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:32:58.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:32:58.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836577 -lt 21474836576 2026-03-08T23:32:58.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:58.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673051 2026-03-08T23:32:58.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:58.825 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:32:58.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673051 2026-03-08T23:32:58.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:58.826 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673051 2026-03-08T23:32:58.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673051 2026-03-08T23:32:58.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673051' 2026-03-08T23:32:58.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:32:59.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673051 -lt 42949673051 2026-03-08T23:32:59.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:32:59.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542228 2026-03-08T23:32:59.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:59.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:32:59.005 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542228 2026-03-08T23:32:59.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:59.006 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542228 2026-03-08T23:32:59.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542228 2026-03-08T23:32:59.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542228' 2026-03-08T23:32:59.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:32:59.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542228 -lt 60129542228 2026-03-08T23:32:59.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:32:59.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:32:59.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:32:59.339 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:32:59.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:32:59.351 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:32:59.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:32:59.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:32:59.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:32:59.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:32:59.518 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:32:59.518 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:32:59.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:32:59.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:59.518 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:32:59.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836580 2026-03-08T23:32:59.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836580 2026-03-08T23:32:59.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836580' 2026-03-08T23:32:59.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:59.638 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:32:59.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673054 2026-03-08T23:32:59.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673054 2026-03-08T23:32:59.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836580 1-42949673054' 2026-03-08T23:32:59.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:32:59.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:32:59.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542231 2026-03-08T23:32:59.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542231 2026-03-08T23:32:59.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836580 1-42949673054 2-60129542231' 2026-03-08T23:32:59.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:32:59.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836580 2026-03-08T23:32:59.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:32:59.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:32:59.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836580 2026-03-08T23:32:59.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:32:59.819 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836580 2026-03-08T23:32:59.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836580 2026-03-08T23:32:59.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836580' 2026-03-08T23:32:59.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:00.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836579 -lt 21474836580 2026-03-08T23:33:00.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:00.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 
']' 2026-03-08T23:33:00.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:01.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836579 -lt 21474836580 2026-03-08T23:33:01.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:02.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:33:02.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:02.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836582 -lt 21474836580 2026-03-08T23:33:02.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:02.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673054 2026-03-08T23:33:02.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:02.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:02.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673054 2026-03-08T23:33:02.335 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:02.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673054 2026-03-08T23:33:02.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673054' 2026-03-08T23:33:02.336 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673054 2026-03-08T23:33:02.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:02.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673056 -lt 42949673054 2026-03-08T23:33:02.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:02.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542231 2026-03-08T23:33:02.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:02.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:02.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542231 2026-03-08T23:33:02.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:33:02.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542231 2026-03-08T23:33:02.508 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542231 2026-03-08T23:33:02.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542231' 2026-03-08T23:33:02.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:02.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542233 -lt 60129542231 2026-03-08T23:33:02.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:02.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:02.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:02.866 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:02.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:02.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:02.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:02.879 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:02.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:03.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:03.060 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:03.060 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:03.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:03.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:03.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:03.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836584 2026-03-08T23:33:03.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836584 2026-03-08T23:33:03.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836584' 2026-03-08T23:33:03.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:03.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:33:03.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673058 2026-03-08T23:33:03.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673058 2026-03-08T23:33:03.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836584 1-42949673058' 2026-03-08T23:33:03.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:03.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542236 2026-03-08T23:33:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542236 2026-03-08T23:33:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836584 1-42949673058 2-60129542236' 2026-03-08T23:33:03.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:03.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836584 2026-03-08T23:33:03.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:03.323 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:03.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836584 2026-03-08T23:33:03.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:03.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836584 2026-03-08T23:33:03.325 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836584 2026-03-08T23:33:03.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836584' 2026-03-08T23:33:03.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:03.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836582 -lt 21474836584 2026-03-08T23:33:03.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:04.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:04.506 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:04.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836585 -lt 
21474836584 2026-03-08T23:33:04.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:04.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673058 2026-03-08T23:33:04.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:04.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:04.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673058 2026-03-08T23:33:04.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:04.683 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673058 2026-03-08T23:33:04.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673058 2026-03-08T23:33:04.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673058' 2026-03-08T23:33:04.683 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:04.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673059 -lt 42949673058 2026-03-08T23:33:04.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:33:04.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542236 2026-03-08T23:33:04.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:04.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:04.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542236 2026-03-08T23:33:04.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:04.862 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542236 2026-03-08T23:33:04.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542236 2026-03-08T23:33:04.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542236' 2026-03-08T23:33:04.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:05.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542236 -lt 60129542236 2026-03-08T23:33:05.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:05.064 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:05.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:05.235 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:05.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:05.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:05.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:05.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:05.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:05.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:05.423 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:05.423 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:05.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:05.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:05.423 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:05.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836588 2026-03-08T23:33:05.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836588 2026-03-08T23:33:05.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836588' 2026-03-08T23:33:05.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:05.508 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:05.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673062 2026-03-08T23:33:05.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673062 2026-03-08T23:33:05.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836588 1-42949673062' 2026-03-08T23:33:05.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:05.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:05.769 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542239 2026-03-08T23:33:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542239 2026-03-08T23:33:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836588 1-42949673062 2-60129542239' 2026-03-08T23:33:05.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:05.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836588 2026-03-08T23:33:05.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:05.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:05.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836588 2026-03-08T23:33:05.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:05.772 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836588 2026-03-08T23:33:05.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836588 2026-03-08T23:33:05.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836588' 
2026-03-08T23:33:05.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:05.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836588 -lt 21474836588 2026-03-08T23:33:05.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:05.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673062 2026-03-08T23:33:05.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:05.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:05.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673062 2026-03-08T23:33:05.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:05.953 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673062 2026-03-08T23:33:05.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673062 2026-03-08T23:33:05.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673062' 2026-03-08T23:33:05.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T23:33:06.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673061 -lt 42949673062 2026-03-08T23:33:06.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:07.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:07.127 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:07.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673061 -lt 42949673062 2026-03-08T23:33:07.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:08.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:33:08.302 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:08.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673064 -lt 42949673062 2026-03-08T23:33:08.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:08.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542239 2026-03-08T23:33:08.471 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:08.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:08.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542239 2026-03-08T23:33:08.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:08.473 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542239 2026-03-08T23:33:08.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542239 2026-03-08T23:33:08.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542239' 2026-03-08T23:33:08.473 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:08.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542241 -lt 60129542239 2026-03-08T23:33:08.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:08.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:08.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 
2026-03-08T23:33:08.806 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:08.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:08.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:08.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:08.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:08.820 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:08.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:08.989 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:08.989 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:08.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:08.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:08.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:09.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836592
2026-03-08T23:33:09.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836592
2026-03-08T23:33:09.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836592'
2026-03-08T23:33:09.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:09.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:09.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673066
2026-03-08T23:33:09.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673066
2026-03-08T23:33:09.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836592 1-42949673066'
2026-03-08T23:33:09.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:09.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:09.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542244
2026-03-08T23:33:09.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542244
2026-03-08T23:33:09.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836592 1-42949673066 2-60129542244'
2026-03-08T23:33:09.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:09.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836592
2026-03-08T23:33:09.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:09.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:09.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836592
2026-03-08T23:33:09.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:09.230 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836592
2026-03-08T23:33:09.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836592
2026-03-08T23:33:09.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836592'
2026-03-08T23:33:09.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:09.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836590 -lt 21474836592
2026-03-08T23:33:09.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:10.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:10.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:10.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836593 -lt 21474836592
2026-03-08T23:33:10.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:10.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673066
2026-03-08T23:33:10.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:10.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:10.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673066
2026-03-08T23:33:10.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:10.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673066
2026-03-08T23:33:10.577 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673066
2026-03-08T23:33:10.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673066'
2026-03-08T23:33:10.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:10.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673067 -lt 42949673066
2026-03-08T23:33:10.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:10.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542244
2026-03-08T23:33:10.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:10.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:10.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542244
2026-03-08T23:33:10.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:10.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542244
2026-03-08T23:33:10.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542244'
2026-03-08T23:33:10.757 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542244
2026-03-08T23:33:10.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:10.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542244 -lt 60129542244
2026-03-08T23:33:10.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:10.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:10.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:11.089 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:11.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:11.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:11.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:11.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:11.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:11.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:11.277 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:11.277 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:11.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:11.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:11.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:11.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836595
2026-03-08T23:33:11.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836595
2026-03-08T23:33:11.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836595'
2026-03-08T23:33:11.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:11.363 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:11.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673070
2026-03-08T23:33:11.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673070
2026-03-08T23:33:11.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836595 1-42949673070'
2026-03-08T23:33:11.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:11.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:11.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542247
2026-03-08T23:33:11.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542247
2026-03-08T23:33:11.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836595 1-42949673070 2-60129542247'
2026-03-08T23:33:11.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:11.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836595
2026-03-08T23:33:11.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:11.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:11.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836595
2026-03-08T23:33:11.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836595
2026-03-08T23:33:11.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836595'
2026-03-08T23:33:11.533 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836595
2026-03-08T23:33:11.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:11.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836596 -lt 21474836595
2026-03-08T23:33:11.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:11.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673070
2026-03-08T23:33:11.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:11.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:11.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673070
2026-03-08T23:33:11.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:11.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673070
2026-03-08T23:33:11.716 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673070
2026-03-08T23:33:11.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673070'
2026-03-08T23:33:11.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:11.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673070 -lt 42949673070
2026-03-08T23:33:11.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:11.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542247
2026-03-08T23:33:11.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:11.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:11.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542247
2026-03-08T23:33:11.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:11.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542247
2026-03-08T23:33:11.898 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542247
2026-03-08T23:33:11.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542247'
2026-03-08T23:33:11.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:12.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542247 -lt 60129542247
2026-03-08T23:33:12.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:12.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:12.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:12.237 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:12.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:12.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:12.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:12.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:12.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:12.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:12.429 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:12.429 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:12.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:12.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:12.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:12.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836598
2026-03-08T23:33:12.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836598
2026-03-08T23:33:12.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836598'
2026-03-08T23:33:12.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:12.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:12.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673072
2026-03-08T23:33:12.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673072
2026-03-08T23:33:12.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836598 1-42949673072'
2026-03-08T23:33:12.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:12.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:12.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542249
2026-03-08T23:33:12.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542249
2026-03-08T23:33:12.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836598 1-42949673072 2-60129542249'
2026-03-08T23:33:12.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:12.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836598
2026-03-08T23:33:12.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:12.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:12.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836598
2026-03-08T23:33:12.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:12.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836598
2026-03-08T23:33:12.690 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836598
2026-03-08T23:33:12.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836598'
2026-03-08T23:33:12.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:12.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836596 -lt 21474836598
2026-03-08T23:33:12.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:13.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:13.865 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:14.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836599 -lt 21474836598
2026-03-08T23:33:14.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:14.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673072
2026-03-08T23:33:14.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:14.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:14.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673072
2026-03-08T23:33:14.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:14.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673072
2026-03-08T23:33:14.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673072'
2026-03-08T23:33:14.045 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673072
2026-03-08T23:33:14.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:14.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673073 -lt 42949673072
2026-03-08T23:33:14.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:14.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542249
2026-03-08T23:33:14.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:14.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:14.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542249
2026-03-08T23:33:14.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:14.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542249
2026-03-08T23:33:14.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542249'
2026-03-08T23:33:14.231 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542249
2026-03-08T23:33:14.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:14.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542250 -lt 60129542249
2026-03-08T23:33:14.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:14.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:14.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:14.558 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:14.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:14.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:14.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:14.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:14.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:14.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:14.738 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:14.738 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:14.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:14.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:14.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:14.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836601
2026-03-08T23:33:14.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836601
2026-03-08T23:33:14.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836601'
2026-03-08T23:33:14.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:14.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:14.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673075
2026-03-08T23:33:14.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673075
2026-03-08T23:33:14.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836601 1-42949673075'
2026-03-08T23:33:14.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:14.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:14.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542253
2026-03-08T23:33:14.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542253
2026-03-08T23:33:14.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836601 1-42949673075 2-60129542253'
2026-03-08T23:33:14.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:14.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836601
2026-03-08T23:33:14.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:14.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:14.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836601
2026-03-08T23:33:14.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:14.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836601
2026-03-08T23:33:14.978 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836601
2026-03-08T23:33:14.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836601'
2026-03-08T23:33:14.978 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:15.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836599 -lt 21474836601
2026-03-08T23:33:15.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:16.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:16.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:16.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836602 -lt 21474836601
2026-03-08T23:33:16.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:16.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673075
2026-03-08T23:33:16.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:16.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:16.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673075
2026-03-08T23:33:16.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:16.335 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1
seq 42949673075 2026-03-08T23:33:16.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673075 2026-03-08T23:33:16.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673075' 2026-03-08T23:33:16.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:16.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673076 -lt 42949673075 2026-03-08T23:33:16.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:16.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542253 2026-03-08T23:33:16.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:16.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:16.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542253 2026-03-08T23:33:16.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:16.505 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542253 2026-03-08T23:33:16.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=60129542253 2026-03-08T23:33:16.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542253' 2026-03-08T23:33:16.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:16.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542253 -lt 60129542253 2026-03-08T23:33:16.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:16.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:16.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:16.825 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:16.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:16.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:16.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:16.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:16.838 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:17.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:17.011 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:17.011 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:17.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:17.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:17.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:17.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836604 2026-03-08T23:33:17.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836604 2026-03-08T23:33:17.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836604' 2026-03-08T23:33:17.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:17.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:17.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=42949673078 2026-03-08T23:33:17.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673078 2026-03-08T23:33:17.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836604 1-42949673078' 2026-03-08T23:33:17.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:17.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:17.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542256 2026-03-08T23:33:17.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542256 2026-03-08T23:33:17.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836604 1-42949673078 2-60129542256' 2026-03-08T23:33:17.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:17.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836604 2026-03-08T23:33:17.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:17.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:17.260 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836604 2026-03-08T23:33:17.260 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:17.261 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836604 2026-03-08T23:33:17.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836604 2026-03-08T23:33:17.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836604' 2026-03-08T23:33:17.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:17.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836602 -lt 21474836604 2026-03-08T23:33:17.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:18.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:18.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:18.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836605 -lt 21474836604 2026-03-08T23:33:18.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T23:33:18.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673078 2026-03-08T23:33:18.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:18.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:18.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673078 2026-03-08T23:33:18.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:18.609 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673078 2026-03-08T23:33:18.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673078 2026-03-08T23:33:18.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673078' 2026-03-08T23:33:18.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:18.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673079 -lt 42949673078 2026-03-08T23:33:18.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:18.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-60129542256 2026-03-08T23:33:18.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:18.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:18.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542256 2026-03-08T23:33:18.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:18.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542256 2026-03-08T23:33:18.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542256' 2026-03-08T23:33:18.779 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542256 2026-03-08T23:33:18.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:18.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542256 -lt 60129542256 2026-03-08T23:33:18.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:18.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:18.945 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:19.095 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:19.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:19.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:19.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:19.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:19.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:19.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:19.280 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:19.280 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:19.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:19.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:19.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:19.361 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836607 2026-03-08T23:33:19.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836607 2026-03-08T23:33:19.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836607' 2026-03-08T23:33:19.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:19.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:19.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673082 2026-03-08T23:33:19.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673082 2026-03-08T23:33:19.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836607 1-42949673082' 2026-03-08T23:33:19.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:19.439 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:19.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542259 2026-03-08T23:33:19.514 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542259 2026-03-08T23:33:19.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836607 1-42949673082 2-60129542259' 2026-03-08T23:33:19.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:19.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836607 2026-03-08T23:33:19.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:19.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:19.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836607 2026-03-08T23:33:19.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:19.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836607 2026-03-08T23:33:19.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836607' 2026-03-08T23:33:19.516 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836607 2026-03-08T23:33:19.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:33:19.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836608 -lt 21474836607 2026-03-08T23:33:19.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:19.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673082 2026-03-08T23:33:19.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:19.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:19.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673082 2026-03-08T23:33:19.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:19.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673082 2026-03-08T23:33:19.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673082' 2026-03-08T23:33:19.679 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673082 2026-03-08T23:33:19.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:19.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 42949673082 -lt 42949673082 2026-03-08T23:33:19.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:19.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542259 2026-03-08T23:33:19.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:19.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:19.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542259 2026-03-08T23:33:19.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:19.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542259 2026-03-08T23:33:19.844 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542259 2026-03-08T23:33:19.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542259' 2026-03-08T23:33:19.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:20.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542259 -lt 60129542259 2026-03-08T23:33:20.004 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:20.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:20.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:20.150 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:20.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:20.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:20.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:20.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:20.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:20.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:20.330 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:20.330 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:20.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:20.330 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:20.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:20.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836610 2026-03-08T23:33:20.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836610 2026-03-08T23:33:20.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836610' 2026-03-08T23:33:20.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:20.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:20.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673084 2026-03-08T23:33:20.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673084 2026-03-08T23:33:20.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836610 1-42949673084' 2026-03-08T23:33:20.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:20.493 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:20.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542261 2026-03-08T23:33:20.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542261 2026-03-08T23:33:20.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836610 1-42949673084 2-60129542261' 2026-03-08T23:33:20.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:20.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836610 2026-03-08T23:33:20.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:20.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:20.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836610 2026-03-08T23:33:20.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:20.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836610 2026-03-08T23:33:20.579 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836610' 2026-03-08T23:33:20.580 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836610 2026-03-08T23:33:20.580 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:20.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836608 -lt 21474836610 2026-03-08T23:33:20.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:21.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:21.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:21.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836611 -lt 21474836610 2026-03-08T23:33:21.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:21.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673084 2026-03-08T23:33:21.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:21.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
osd=1 2026-03-08T23:33:21.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673084 2026-03-08T23:33:21.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:21.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673084 2026-03-08T23:33:21.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673084' 2026-03-08T23:33:21.913 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673084 2026-03-08T23:33:21.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:22.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673085 -lt 42949673084 2026-03-08T23:33:22.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:22.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542261 2026-03-08T23:33:22.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:22.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:22.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 2-60129542261 2026-03-08T23:33:22.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:22.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542261 2026-03-08T23:33:22.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542261' 2026-03-08T23:33:22.080 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542261 2026-03-08T23:33:22.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:22.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542262 -lt 60129542261 2026-03-08T23:33:22.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:22.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:22.390 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:22.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:22.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:22.403 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:22.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:22.403 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:22.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:22.573 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:22.573 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:22.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:22.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:22.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:22.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836613 2026-03-08T23:33:22.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836613 2026-03-08T23:33:22.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836613' 2026-03-08T23:33:22.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:33:22.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:22.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673087 2026-03-08T23:33:22.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673087 2026-03-08T23:33:22.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836613 1-42949673087' 2026-03-08T23:33:22.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:22.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:22.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542264 2026-03-08T23:33:22.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542264 2026-03-08T23:33:22.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836613 1-42949673087 2-60129542264' 2026-03-08T23:33:22.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:22.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836613 2026-03-08T23:33:22.805 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:22.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:22.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836613 2026-03-08T23:33:22.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:22.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836613 2026-03-08T23:33:22.808 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836613 2026-03-08T23:33:22.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836613' 2026-03-08T23:33:22.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:22.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836611 -lt 21474836613 2026-03-08T23:33:22.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:23.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:23.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:33:24.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836614 -lt 21474836613 2026-03-08T23:33:24.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:24.130 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673087 2026-03-08T23:33:24.130 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:24.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:24.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673087 2026-03-08T23:33:24.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:24.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673087 2026-03-08T23:33:24.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673087' 2026-03-08T23:33:24.133 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673087 2026-03-08T23:33:24.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:24.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
42949673088 -lt 42949673087 2026-03-08T23:33:24.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:24.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542264 2026-03-08T23:33:24.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:24.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:24.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542264 2026-03-08T23:33:24.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:24.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542264 2026-03-08T23:33:24.295 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542264 2026-03-08T23:33:24.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542264' 2026-03-08T23:33:24.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:24.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542265 -lt 60129542264 2026-03-08T23:33:24.453 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:24.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:24.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:24.600 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:24.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:24.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:24.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:24.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:24.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:24.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:24.771 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:24.771 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:24.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:24.771 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:24.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:24.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836616 2026-03-08T23:33:24.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836616 2026-03-08T23:33:24.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836616' 2026-03-08T23:33:24.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:24.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:24.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673090 2026-03-08T23:33:24.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673090 2026-03-08T23:33:24.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836616 1-42949673090' 2026-03-08T23:33:24.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:24.923 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:24.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542268 2026-03-08T23:33:24.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542268 2026-03-08T23:33:24.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836616 1-42949673090 2-60129542268' 2026-03-08T23:33:24.997 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:24.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836616 2026-03-08T23:33:24.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:24.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:24.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836616 2026-03-08T23:33:24.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:24.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836616 2026-03-08T23:33:24.999 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836616' 2026-03-08T23:33:24.999 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836616 2026-03-08T23:33:24.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:25.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836614 -lt 21474836616 2026-03-08T23:33:25.156 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:26.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:26.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:26.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836617 -lt 21474836616 2026-03-08T23:33:26.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:26.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673090 2026-03-08T23:33:26.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:26.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
osd=1 2026-03-08T23:33:26.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673090 2026-03-08T23:33:26.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:26.318 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673090 2026-03-08T23:33:26.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673090 2026-03-08T23:33:26.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673090' 2026-03-08T23:33:26.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:26.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673091 -lt 42949673090 2026-03-08T23:33:26.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:26.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542268 2026-03-08T23:33:26.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:26.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:26.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
echo 2-60129542268 2026-03-08T23:33:26.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:26.481 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542268 2026-03-08T23:33:26.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542268 2026-03-08T23:33:26.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542268' 2026-03-08T23:33:26.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:26.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542268 -lt 60129542268 2026-03-08T23:33:26.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:26.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:26.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:26.785 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:26.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:26.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:26.798 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:26.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:26.798 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:26.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:26.960 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:26.960 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:26.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:26.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:26.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:27.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836619 2026-03-08T23:33:27.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836619 2026-03-08T23:33:27.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836619' 2026-03-08T23:33:27.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:33:27.041 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:27.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673093
2026-03-08T23:33:27.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673093
2026-03-08T23:33:27.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836619 1-42949673093'
2026-03-08T23:33:27.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:27.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:27.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542271
2026-03-08T23:33:27.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542271
2026-03-08T23:33:27.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836619 1-42949673093 2-60129542271'
2026-03-08T23:33:27.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:27.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836619
2026-03-08T23:33:27.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:27.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:27.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836619
2026-03-08T23:33:27.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:27.197 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836619
2026-03-08T23:33:27.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836619
2026-03-08T23:33:27.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836619'
2026-03-08T23:33:27.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:27.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836617 -lt 21474836619
2026-03-08T23:33:27.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:28.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:28.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:28.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836620 -lt 21474836619
2026-03-08T23:33:28.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:28.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673093
2026-03-08T23:33:28.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:28.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:28.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673093
2026-03-08T23:33:28.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:28.520 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673093
2026-03-08T23:33:28.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673093
2026-03-08T23:33:28.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673093'
2026-03-08T23:33:28.520 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:28.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673094 -lt 42949673093
2026-03-08T23:33:28.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:28.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542271
2026-03-08T23:33:28.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:28.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:28.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542271
2026-03-08T23:33:28.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:28.683 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542271
2026-03-08T23:33:28.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542271
2026-03-08T23:33:28.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542271'
2026-03-08T23:33:28.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:28.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542271 -lt 60129542271
2026-03-08T23:33:28.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:28.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:28.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:28.999 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:29.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:29.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:29.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:29.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:29.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:29.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:29.169 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:29.169 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:29.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:29.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:29.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:29.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836622
2026-03-08T23:33:29.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836622
2026-03-08T23:33:29.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836622'
2026-03-08T23:33:29.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:29.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:29.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673097
2026-03-08T23:33:29.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673097
2026-03-08T23:33:29.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836622 1-42949673097'
2026-03-08T23:33:29.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:29.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:29.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542274
2026-03-08T23:33:29.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542274
2026-03-08T23:33:29.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836622 1-42949673097 2-60129542274'
2026-03-08T23:33:29.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:29.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836622
2026-03-08T23:33:29.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:29.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:29.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836622
2026-03-08T23:33:29.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:29.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836622
2026-03-08T23:33:29.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836622'
2026-03-08T23:33:29.399 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836622
2026-03-08T23:33:29.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836620 -lt 21474836622
2026-03-08T23:33:29.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:30.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:30.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:30.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836623 -lt 21474836622
2026-03-08T23:33:30.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:30.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673097
2026-03-08T23:33:30.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:30.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:30.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673097
2026-03-08T23:33:30.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:30.730 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673097
2026-03-08T23:33:30.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673097
2026-03-08T23:33:30.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673097'
2026-03-08T23:33:30.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:30.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673097 -lt 42949673097
2026-03-08T23:33:30.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:30.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542274
2026-03-08T23:33:30.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:30.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:30.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542274
2026-03-08T23:33:30.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:30.898 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542274
2026-03-08T23:33:30.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542274
2026-03-08T23:33:30.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542274'
2026-03-08T23:33:30.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:31.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542274 -lt 60129542274
2026-03-08T23:33:31.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:31.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:31.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:31.228 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:31.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:31.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:31.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:31.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:31.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:31.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:31.408 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:31.408 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:31.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:31.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:31.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:31.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836626
2026-03-08T23:33:31.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836626
2026-03-08T23:33:31.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836626'
2026-03-08T23:33:31.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:31.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:31.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673100
2026-03-08T23:33:31.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673100
2026-03-08T23:33:31.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836626 1-42949673100'
2026-03-08T23:33:31.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:31.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:31.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542277
2026-03-08T23:33:31.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542277
2026-03-08T23:33:31.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836626 1-42949673100 2-60129542277'
2026-03-08T23:33:31.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:31.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836626
2026-03-08T23:33:31.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:31.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:31.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836626
2026-03-08T23:33:31.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:31.651 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836626
2026-03-08T23:33:31.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836626
2026-03-08T23:33:31.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836626'
2026-03-08T23:33:31.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:31.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836626 -lt 21474836626
2026-03-08T23:33:31.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:31.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673100
2026-03-08T23:33:31.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:31.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:31.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673100
2026-03-08T23:33:31.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:31.826 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673100
2026-03-08T23:33:31.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673100
2026-03-08T23:33:31.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673100'
2026-03-08T23:33:31.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:31.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673100 -lt 42949673100
2026-03-08T23:33:31.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:31.995 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542277
2026-03-08T23:33:31.995 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:31.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:31.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542277
2026-03-08T23:33:31.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:31.998 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542277
2026-03-08T23:33:31.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542277
2026-03-08T23:33:31.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542277'
2026-03-08T23:33:31.998 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:32.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542276 -lt 60129542277
2026-03-08T23:33:32.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:33.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:33.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:33.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542276 -lt 60129542277
2026-03-08T23:33:33.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:34.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:33:34.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:34.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542279 -lt 60129542277
2026-03-08T23:33:34.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:34.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:34.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:34.669 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:34.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:34.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:34.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:34.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:34.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:34.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:34.853 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:34.853 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:34.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:34.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:34.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:34.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836630
2026-03-08T23:33:34.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836630
2026-03-08T23:33:34.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836630'
2026-03-08T23:33:34.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:34.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:35.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673104
2026-03-08T23:33:35.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673104
2026-03-08T23:33:35.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836630 1-42949673104'
2026-03-08T23:33:35.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:35.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:35.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542282
2026-03-08T23:33:35.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542282
2026-03-08T23:33:35.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836630 1-42949673104 2-60129542282'
2026-03-08T23:33:35.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:35.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836630
2026-03-08T23:33:35.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:35.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:35.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836630
2026-03-08T23:33:35.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:35.088 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836630
2026-03-08T23:33:35.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836630
2026-03-08T23:33:35.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836630'
2026-03-08T23:33:35.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:35.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836628 -lt 21474836630
2026-03-08T23:33:35.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:36.258 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:36.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:36.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836631 -lt 21474836630
2026-03-08T23:33:36.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:36.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673104
2026-03-08T23:33:36.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:36.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:36.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673104
2026-03-08T23:33:36.440 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:36.442 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673104
2026-03-08T23:33:36.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673104
2026-03-08T23:33:36.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673104'
2026-03-08T23:33:36.442 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:36.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673105 -lt 42949673104
2026-03-08T23:33:36.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:36.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542282
2026-03-08T23:33:36.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:36.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:36.617 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542282
2026-03-08T23:33:36.617 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:36.618 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542282
2026-03-08T23:33:36.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542282
2026-03-08T23:33:36.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542282'
2026-03-08T23:33:36.618 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:36.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542282 -lt 60129542282
2026-03-08T23:33:36.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:36.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:36.787 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:36.939 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:36.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:36.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:36.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:36.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:36.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:37.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:37.131 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:37.131 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:37.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:37.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:37.131 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:37.218 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836633 2026-03-08T23:33:37.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836633 2026-03-08T23:33:37.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836633' 2026-03-08T23:33:37.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:37.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:37.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673108 2026-03-08T23:33:37.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673108 2026-03-08T23:33:37.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836633 1-42949673108' 2026-03-08T23:33:37.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:37.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:37.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542285 2026-03-08T23:33:37.390 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542285 2026-03-08T23:33:37.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836633 1-42949673108 2-60129542285' 2026-03-08T23:33:37.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:37.390 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836633 2026-03-08T23:33:37.390 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:37.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:37.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836633 2026-03-08T23:33:37.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:37.393 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836633 2026-03-08T23:33:37.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836633 2026-03-08T23:33:37.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836633' 2026-03-08T23:33:37.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:33:37.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836631 -lt 21474836633 2026-03-08T23:33:37.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:38.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:38.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:38.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836634 -lt 21474836633 2026-03-08T23:33:38.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:38.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673108 2026-03-08T23:33:38.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:38.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:38.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673108 2026-03-08T23:33:38.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:38.746 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 
seq 42949673108 2026-03-08T23:33:38.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673108 2026-03-08T23:33:38.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673108' 2026-03-08T23:33:38.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:38.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673108 -lt 42949673108 2026-03-08T23:33:38.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:38.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542285 2026-03-08T23:33:38.917 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:38.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:38.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542285 2026-03-08T23:33:38.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:38.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542285 2026-03-08T23:33:38.920 
INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542285 2026-03-08T23:33:38.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542285' 2026-03-08T23:33:38.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:39.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542285 -lt 60129542285 2026-03-08T23:33:39.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:39.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:39.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:39.248 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:39.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:39.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:39.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:39.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:39.261 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:39.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:39.427 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:39.427 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:39.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:39.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:39.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:39.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836637 2026-03-08T23:33:39.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836637 2026-03-08T23:33:39.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836637' 2026-03-08T23:33:39.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:39.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:39.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=42949673111 2026-03-08T23:33:39.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673111 2026-03-08T23:33:39.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836637 1-42949673111' 2026-03-08T23:33:39.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:39.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:39.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542288 2026-03-08T23:33:39.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542288 2026-03-08T23:33:39.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836637 1-42949673111 2-60129542288' 2026-03-08T23:33:39.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:39.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:39.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836637 2026-03-08T23:33:39.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:39.666 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836637 2026-03-08T23:33:39.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:39.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836637 2026-03-08T23:33:39.667 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836637 2026-03-08T23:33:39.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836637' 2026-03-08T23:33:39.667 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:39.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836637 -lt 21474836637 2026-03-08T23:33:39.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:39.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673111 2026-03-08T23:33:39.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:39.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:39.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673111 
2026-03-08T23:33:39.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:39.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673111 2026-03-08T23:33:39.839 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673111 2026-03-08T23:33:39.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673111' 2026-03-08T23:33:39.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:40.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673111 -lt 42949673111 2026-03-08T23:33:40.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:40.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542288 2026-03-08T23:33:40.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:40.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:40.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542288 2026-03-08T23:33:40.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d 
- -f 2 2026-03-08T23:33:40.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542288 2026-03-08T23:33:40.011 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542288 2026-03-08T23:33:40.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542288' 2026-03-08T23:33:40.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:40.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542287 -lt 60129542288 2026-03-08T23:33:40.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:41.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:41.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:41.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542287 -lt 60129542288 2026-03-08T23:33:41.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:42.347 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:33:42.347 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:42.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542290 -lt 60129542288 2026-03-08T23:33:42.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:42.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:42.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:42.674 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:42.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:42.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:42.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:42.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:42.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:42.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:42.852 
INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:42.852 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:42.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:42.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:42.852 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:42.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836641 2026-03-08T23:33:42.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836641 2026-03-08T23:33:42.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836641' 2026-03-08T23:33:42.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:42.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:43.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673115 2026-03-08T23:33:43.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673115 2026-03-08T23:33:43.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836641 1-42949673115' 2026-03-08T23:33:43.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:43.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:43.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542293 2026-03-08T23:33:43.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542293 2026-03-08T23:33:43.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836641 1-42949673115 2-60129542293' 2026-03-08T23:33:43.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:43.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836641 2026-03-08T23:33:43.099 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:43.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:43.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836641 2026-03-08T23:33:43.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:43.102 
INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836641 2026-03-08T23:33:43.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836641 2026-03-08T23:33:43.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836641' 2026-03-08T23:33:43.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:43.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836639 -lt 21474836641 2026-03-08T23:33:43.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:44.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:44.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:44.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836642 -lt 21474836641 2026-03-08T23:33:44.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:44.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673115 2026-03-08T23:33:44.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:33:44.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:44.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673115 2026-03-08T23:33:44.448 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:44.449 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673115 2026-03-08T23:33:44.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673115 2026-03-08T23:33:44.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673115' 2026-03-08T23:33:44.449 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:44.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673116 -lt 42949673115 2026-03-08T23:33:44.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:44.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542293 2026-03-08T23:33:44.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:44.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T23:33:44.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542293 2026-03-08T23:33:44.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:44.630 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542293 2026-03-08T23:33:44.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542293 2026-03-08T23:33:44.630 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542293' 2026-03-08T23:33:44.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:44.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542293 -lt 60129542293 2026-03-08T23:33:44.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:44.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:44.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:44.957 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:44.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:44.971 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:44.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:44.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:44.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:45.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:45.148 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:45.148 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:45.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:45.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:45.148 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:45.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836644 2026-03-08T23:33:45.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836644 2026-03-08T23:33:45.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836644' 2026-03-08T23:33:45.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:45.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:45.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673119 2026-03-08T23:33:45.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673119 2026-03-08T23:33:45.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836644 1-42949673119' 2026-03-08T23:33:45.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:45.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:45.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542296 2026-03-08T23:33:45.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542296 2026-03-08T23:33:45.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836644 1-42949673119 2-60129542296' 2026-03-08T23:33:45.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:33:45.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836644 2026-03-08T23:33:45.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:45.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:45.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836644 2026-03-08T23:33:45.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:45.387 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836644 2026-03-08T23:33:45.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836644 2026-03-08T23:33:45.388 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836644' 2026-03-08T23:33:45.388 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:45.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836642 -lt 21474836644 2026-03-08T23:33:45.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:46.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 
']' 2026-03-08T23:33:46.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:46.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836645 -lt 21474836644 2026-03-08T23:33:46.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:46.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673119 2026-03-08T23:33:46.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:46.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:46.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673119 2026-03-08T23:33:46.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:46.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673119 2026-03-08T23:33:46.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673119' 2026-03-08T23:33:46.728 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673119 2026-03-08T23:33:46.728 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:33:46.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673119 -lt 42949673119 2026-03-08T23:33:46.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:46.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542296 2026-03-08T23:33:46.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:46.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:46.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542296 2026-03-08T23:33:46.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:46.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542296 2026-03-08T23:33:46.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542296' 2026-03-08T23:33:46.902 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542296 2026-03-08T23:33:46.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:47.065 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542296 -lt 60129542296 2026-03-08T23:33:47.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:47.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:47.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:47.212 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:47.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:47.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:47.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:47.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:47.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:47.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:47.386 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:47.386 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:47.386 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:47.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:47.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:47.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836648 2026-03-08T23:33:47.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836648 2026-03-08T23:33:47.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836648' 2026-03-08T23:33:47.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:47.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:47.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673122 2026-03-08T23:33:47.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673122 2026-03-08T23:33:47.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836648 1-42949673122' 2026-03-08T23:33:47.545 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:47.545 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:47.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542299 2026-03-08T23:33:47.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542299 2026-03-08T23:33:47.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836648 1-42949673122 2-60129542299' 2026-03-08T23:33:47.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:47.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836648 2026-03-08T23:33:47.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:47.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:47.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836648 2026-03-08T23:33:47.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:47.631 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836648 2026-03-08T23:33:47.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836648' 2026-03-08T23:33:47.631 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836648 2026-03-08T23:33:47.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:47.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836648 -lt 21474836648 2026-03-08T23:33:47.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:47.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673122 2026-03-08T23:33:47.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:47.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:33:47.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673122 2026-03-08T23:33:47.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:47.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673122 
2026-03-08T23:33:47.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673122' 2026-03-08T23:33:47.809 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673122 2026-03-08T23:33:47.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:33:47.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673122 -lt 42949673122 2026-03-08T23:33:47.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:47.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542299 2026-03-08T23:33:47.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:47.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:33:47.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542299 2026-03-08T23:33:47.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:47.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542299 2026-03-08T23:33:47.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.2 seq 60129542299' 2026-03-08T23:33:47.989 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542299 2026-03-08T23:33:47.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:48.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542298 -lt 60129542299 2026-03-08T23:33:48.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:49.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:33:49.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:49.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542298 -lt 60129542299 2026-03-08T23:33:49.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:33:50.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:33:50.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:33:50.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542301 -lt 60129542299 2026-03-08T23:33:50.504 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:33:50.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:33:50.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:33:50.659 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:33:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:33:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:33:50.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:33:50.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:33:50.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:33:50.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:33:50.837 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:33:50.837 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:33:50.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:33:50.837 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:50.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:33:50.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836652 2026-03-08T23:33:50.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836652 2026-03-08T23:33:50.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836652' 2026-03-08T23:33:50.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:50.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:51.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673126 2026-03-08T23:33:51.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673126 2026-03-08T23:33:51.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836652 1-42949673126' 2026-03-08T23:33:51.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:51.002 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:51.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542304 2026-03-08T23:33:51.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542304 2026-03-08T23:33:51.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836652 1-42949673126 2-60129542304' 2026-03-08T23:33:51.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:33:51.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836652 2026-03-08T23:33:51.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:51.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:51.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836652 2026-03-08T23:33:51.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:51.084 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836652 2026-03-08T23:33:51.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836652 
2026-03-08T23:33:51.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836652'
2026-03-08T23:33:51.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:51.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836650 -lt 21474836652
2026-03-08T23:33:51.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:52.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:52.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:52.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836653 -lt 21474836652
2026-03-08T23:33:52.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:52.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673126
2026-03-08T23:33:52.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:52.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:52.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673126
2026-03-08T23:33:52.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:52.428 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673126
2026-03-08T23:33:52.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673126
2026-03-08T23:33:52.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673126'
2026-03-08T23:33:52.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:52.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673127 -lt 42949673126
2026-03-08T23:33:52.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:52.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542304
2026-03-08T23:33:52.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:52.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:52.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542304
2026-03-08T23:33:52.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:52.609 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542304
2026-03-08T23:33:52.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542304
2026-03-08T23:33:52.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542304'
2026-03-08T23:33:52.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:52.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542304 -lt 60129542304
2026-03-08T23:33:52.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:52.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:52.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:52.927 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:52.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:52.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:52.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:52.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:52.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:53.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:53.102 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:53.102 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:53.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:53.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:53.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:53.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836655
2026-03-08T23:33:53.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836655
2026-03-08T23:33:53.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836655'
2026-03-08T23:33:53.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:53.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:53.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673129
2026-03-08T23:33:53.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673129
2026-03-08T23:33:53.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836655 1-42949673129'
2026-03-08T23:33:53.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:53.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:53.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542307
2026-03-08T23:33:53.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542307
2026-03-08T23:33:53.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836655 1-42949673129 2-60129542307'
2026-03-08T23:33:53.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:53.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836655
2026-03-08T23:33:53.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:53.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:53.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836655
2026-03-08T23:33:53.331 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:53.332 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836655
2026-03-08T23:33:53.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836655
2026-03-08T23:33:53.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836655'
2026-03-08T23:33:53.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:53.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836653 -lt 21474836655
2026-03-08T23:33:53.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:54.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:54.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:54.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836656 -lt 21474836655
2026-03-08T23:33:54.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:54.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673129
2026-03-08T23:33:54.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:54.661 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:54.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673129
2026-03-08T23:33:54.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:54.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673129
2026-03-08T23:33:54.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673129'
2026-03-08T23:33:54.662 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673129
2026-03-08T23:33:54.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:54.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673130 -lt 42949673129
2026-03-08T23:33:54.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:54.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542307
2026-03-08T23:33:54.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:54.833 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542307
2026-03-08T23:33:54.833 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:54.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542307
2026-03-08T23:33:54.834 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542307
2026-03-08T23:33:54.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542307'
2026-03-08T23:33:54.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:55.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542307 -lt 60129542307
2026-03-08T23:33:55.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:55.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:55.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:55.150 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:55.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:55.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:55.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:55.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:55.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:55.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:55.330 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:55.330 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:55.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:55.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:55.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:55.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836658
2026-03-08T23:33:55.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836658
2026-03-08T23:33:55.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836658'
2026-03-08T23:33:55.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:55.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:55.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673133
2026-03-08T23:33:55.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673133
2026-03-08T23:33:55.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836658 1-42949673133'
2026-03-08T23:33:55.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:55.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:55.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542310
2026-03-08T23:33:55.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542310
2026-03-08T23:33:55.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836658 1-42949673133 2-60129542310'
2026-03-08T23:33:55.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:55.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836658
2026-03-08T23:33:55.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:55.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:55.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836658
2026-03-08T23:33:55.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:55.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836658
2026-03-08T23:33:55.573 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836658
2026-03-08T23:33:55.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836658'
2026-03-08T23:33:55.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:55.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836659 -lt 21474836658
2026-03-08T23:33:55.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:55.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673133
2026-03-08T23:33:55.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:55.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:55.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673133
2026-03-08T23:33:55.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:55.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673133
2026-03-08T23:33:55.746 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673133
2026-03-08T23:33:55.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673133'
2026-03-08T23:33:55.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:55.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673133 -lt 42949673133
2026-03-08T23:33:55.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:55.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542310
2026-03-08T23:33:55.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:55.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:55.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542310
2026-03-08T23:33:55.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:55.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542310
2026-03-08T23:33:55.912 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542310
2026-03-08T23:33:55.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542310'
2026-03-08T23:33:55.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:56.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542310 -lt 60129542310
2026-03-08T23:33:56.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:56.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:56.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:56.234 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:56.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:56.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:56.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:56.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:56.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:56.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:56.409 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:56.409 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:56.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:56.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:56.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:56.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836661
2026-03-08T23:33:56.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836661
2026-03-08T23:33:56.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836661'
2026-03-08T23:33:56.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:56.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:33:56.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673135
2026-03-08T23:33:56.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673135
2026-03-08T23:33:56.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836661 1-42949673135'
2026-03-08T23:33:56.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:56.558 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:33:56.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542312
2026-03-08T23:33:56.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542312
2026-03-08T23:33:56.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836661 1-42949673135 2-60129542312'
2026-03-08T23:33:56.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:56.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836661
2026-03-08T23:33:56.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:56.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:33:56.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836661
2026-03-08T23:33:56.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:56.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836661
2026-03-08T23:33:56.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836661'
2026-03-08T23:33:56.637 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836661
2026-03-08T23:33:56.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:56.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836659 -lt 21474836661
2026-03-08T23:33:56.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:33:57.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:33:57.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:33:57.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836662 -lt 21474836661
2026-03-08T23:33:57.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:57.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673135
2026-03-08T23:33:57.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:57.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:33:57.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673135
2026-03-08T23:33:57.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:57.971 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673135
2026-03-08T23:33:57.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673135
2026-03-08T23:33:57.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673135'
2026-03-08T23:33:57.971 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:33:58.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673136 -lt 42949673135
2026-03-08T23:33:58.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:33:58.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542312
2026-03-08T23:33:58.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:33:58.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:33:58.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542312
2026-03-08T23:33:58.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:33:58.135 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542312
2026-03-08T23:33:58.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542312
2026-03-08T23:33:58.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542312'
2026-03-08T23:33:58.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:33:58.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542313 -lt 60129542312
2026-03-08T23:33:58.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:33:58.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:33:58.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:33:58.465 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:33:58.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:33:58.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:33:58.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:33:58.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:33:58.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:33:58.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:33:58.648 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:33:58.648 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:33:58.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:33:58.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:33:58.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:33:58.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836664
2026-03-08T23:33:58.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836664
2026-03-08T23:33:58.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836664' 2026-03-08T23:33:58.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:58.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:33:58.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673138 2026-03-08T23:33:58.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673138 2026-03-08T23:33:58.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836664 1-42949673138' 2026-03-08T23:33:58.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:33:58.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:33:58.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542315 2026-03-08T23:33:58.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542315 2026-03-08T23:33:58.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836664 1-42949673138 2-60129542315' 2026-03-08T23:33:58.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:33:58.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836664 2026-03-08T23:33:58.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:33:58.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:33:58.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836664 2026-03-08T23:33:58.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:33:58.886 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836664 2026-03-08T23:33:58.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836664 2026-03-08T23:33:58.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836664' 2026-03-08T23:33:58.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:33:59.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836662 -lt 21474836664 2026-03-08T23:33:59.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:34:00.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 
']' 2026-03-08T23:34:00.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:00.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836665 -lt 21474836664 2026-03-08T23:34:00.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:00.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673138 2026-03-08T23:34:00.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:00.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:00.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673138 2026-03-08T23:34:00.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:00.275 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673138 2026-03-08T23:34:00.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673138 2026-03-08T23:34:00.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673138' 2026-03-08T23:34:00.276 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:34:00.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673139 -lt 42949673138 2026-03-08T23:34:00.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:00.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542315 2026-03-08T23:34:00.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:00.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:00.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542315 2026-03-08T23:34:00.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:00.491 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542315 2026-03-08T23:34:00.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542315 2026-03-08T23:34:00.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542315' 2026-03-08T23:34:00.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:34:00.673 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542316 -lt 60129542315 2026-03-08T23:34:00.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:34:00.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:34:00.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:34:00.829 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:34:00.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:34:00.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:34:00.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:34:00.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:34:00.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:34:01.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:34:01.007 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:34:01.007 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:34:01.007 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:34:01.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:01.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:34:01.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836667 2026-03-08T23:34:01.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836667 2026-03-08T23:34:01.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836667' 2026-03-08T23:34:01.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:01.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:34:01.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673141 2026-03-08T23:34:01.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673141 2026-03-08T23:34:01.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836667 1-42949673141' 2026-03-08T23:34:01.164 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:01.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:34:01.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542319 2026-03-08T23:34:01.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542319 2026-03-08T23:34:01.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836667 1-42949673141 2-60129542319' 2026-03-08T23:34:01.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:01.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836667 2026-03-08T23:34:01.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:01.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:34:01.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836667 2026-03-08T23:34:01.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:01.246 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836667 
2026-03-08T23:34:01.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836667 2026-03-08T23:34:01.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836667' 2026-03-08T23:34:01.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:01.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836665 -lt 21474836667 2026-03-08T23:34:01.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:34:02.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:34:02.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:02.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836668 -lt 21474836667 2026-03-08T23:34:02.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:02.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673141 2026-03-08T23:34:02.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:02.588 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:02.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673141 2026-03-08T23:34:02.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:02.589 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673141 2026-03-08T23:34:02.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673141 2026-03-08T23:34:02.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673141' 2026-03-08T23:34:02.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:02.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673142 -lt 42949673141 2026-03-08T23:34:02.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:02.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542319 2026-03-08T23:34:02.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:02.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:02.753 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542319 2026-03-08T23:34:02.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:02.754 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542319 2026-03-08T23:34:02.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542319 2026-03-08T23:34:02.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542319' 2026-03-08T23:34:02.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:34:02.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542319 -lt 60129542319 2026-03-08T23:34:02.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:34:02.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:34:02.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:34:03.072 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:34:03.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:34:03.085 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:34:03.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:34:03.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:34:03.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:34:03.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:34:03.249 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:34:03.249 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:34:03.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:34:03.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:03.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:34:03.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836670 2026-03-08T23:34:03.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836670 2026-03-08T23:34:03.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836670' 2026-03-08T23:34:03.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:03.332 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:34:03.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673145 2026-03-08T23:34:03.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673145 2026-03-08T23:34:03.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836670 1-42949673145' 2026-03-08T23:34:03.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:03.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:34:03.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542322 2026-03-08T23:34:03.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542322 2026-03-08T23:34:03.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836670 1-42949673145 2-60129542322' 2026-03-08T23:34:03.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:34:03.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836670 2026-03-08T23:34:03.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:03.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:34:03.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836670 2026-03-08T23:34:03.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:03.490 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836670 2026-03-08T23:34:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836670 2026-03-08T23:34:03.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836670' 2026-03-08T23:34:03.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:03.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836671 -lt 21474836670 2026-03-08T23:34:03.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:03.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949673145 2026-03-08T23:34:03.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:03.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:03.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673145 2026-03-08T23:34:03.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:03.658 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673145 2026-03-08T23:34:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673145 2026-03-08T23:34:03.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673145' 2026-03-08T23:34:03.658 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:03.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673145 -lt 42949673145 2026-03-08T23:34:03.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:03.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542322 2026-03-08T23:34:03.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:03.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:03.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542322 2026-03-08T23:34:03.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:03.914 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542322 2026-03-08T23:34:03.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542322 2026-03-08T23:34:03.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542322' 2026-03-08T23:34:03.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:34:04.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542322 -lt 60129542322 2026-03-08T23:34:04.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:34:04.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:34:04.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:34:04.308 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:34:04.321 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:34:04.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:34:04.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:34:04.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:34:04.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:34:04.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:34:04.492 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:34:04.492 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:34:04.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:34:04.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:04.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:34:04.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836673 2026-03-08T23:34:04.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836673 
2026-03-08T23:34:04.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836673' 2026-03-08T23:34:04.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:04.573 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:34:04.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673147 2026-03-08T23:34:04.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673147 2026-03-08T23:34:04.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836673 1-42949673147' 2026-03-08T23:34:04.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:04.657 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:34:04.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542324 2026-03-08T23:34:04.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542324 2026-03-08T23:34:04.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836673 1-42949673147 2-60129542324' 2026-03-08T23:34:04.744 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:04.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836673 2026-03-08T23:34:04.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:04.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:34:04.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836673 2026-03-08T23:34:04.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:04.747 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836673 2026-03-08T23:34:04.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836673 2026-03-08T23:34:04.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836673' 2026-03-08T23:34:04.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:04.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836671 -lt 21474836673 2026-03-08T23:34:04.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:34:05.921 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:34:05.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:06.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836674 -lt 21474836673 2026-03-08T23:34:06.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:06.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673147 2026-03-08T23:34:06.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:06.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:06.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673147 2026-03-08T23:34:06.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:06.093 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673147 2026-03-08T23:34:06.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673147 2026-03-08T23:34:06.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673147' 
2026-03-08T23:34:06.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:06.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673148 -lt 42949673147 2026-03-08T23:34:06.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:06.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542324 2026-03-08T23:34:06.277 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:06.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:06.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542324 2026-03-08T23:34:06.279 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:06.280 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542324 2026-03-08T23:34:06.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542324 2026-03-08T23:34:06.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542324' 2026-03-08T23:34:06.280 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 2 2026-03-08T23:34:06.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542325 -lt 60129542324 2026-03-08T23:34:06.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:34:06.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:34:06.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:34:06.624 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:34:06.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:34:06.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:34:06.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:34:06.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:34:06.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:34:06.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:34:06.807 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:34:06.807 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:34:06.807 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:34:06.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:06.807 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:34:06.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836676 2026-03-08T23:34:06.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836676 2026-03-08T23:34:06.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836676' 2026-03-08T23:34:06.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:06.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:34:06.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673150 2026-03-08T23:34:06.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673150 2026-03-08T23:34:06.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836676 1-42949673150' 2026-03-08T23:34:06.969 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:06.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:34:07.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542328 2026-03-08T23:34:07.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542328 2026-03-08T23:34:07.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836676 1-42949673150 2-60129542328' 2026-03-08T23:34:07.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:07.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836676 2026-03-08T23:34:07.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:07.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:34:07.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836676 2026-03-08T23:34:07.060 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:07.061 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836676 
2026-03-08T23:34:07.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836676 2026-03-08T23:34:07.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836676' 2026-03-08T23:34:07.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:07.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836674 -lt 21474836676 2026-03-08T23:34:07.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:34:08.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:34:08.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:08.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836677 -lt 21474836676 2026-03-08T23:34:08.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:08.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673150 2026-03-08T23:34:08.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:08.406 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:08.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673150 2026-03-08T23:34:08.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:08.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673150 2026-03-08T23:34:08.408 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673150 2026-03-08T23:34:08.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673150' 2026-03-08T23:34:08.408 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:08.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673151 -lt 42949673150 2026-03-08T23:34:08.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:08.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542328 2026-03-08T23:34:08.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:08.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:08.647 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542328 2026-03-08T23:34:08.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:08.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542328 2026-03-08T23:34:08.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542328' 2026-03-08T23:34:08.648 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542328 2026-03-08T23:34:08.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:34:08.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542328 -lt 60129542328 2026-03-08T23:34:08.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:34:08.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:34:08.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:34:08.995 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:34:09.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:34:09.009 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:34:09.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:34:09.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:34:09.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:34:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:34:09.211 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:34:09.211 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:34:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:34:09.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:09.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:34:09.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836679 2026-03-08T23:34:09.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836679 2026-03-08T23:34:09.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836679' 2026-03-08T23:34:09.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:09.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:34:09.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673154 2026-03-08T23:34:09.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673154 2026-03-08T23:34:09.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836679 1-42949673154' 2026-03-08T23:34:09.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:09.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:34:09.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542331 2026-03-08T23:34:09.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542331 2026-03-08T23:34:09.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836679 1-42949673154 2-60129542331' 2026-03-08T23:34:09.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:34:09.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836679 2026-03-08T23:34:09.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:09.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:34:09.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836679 2026-03-08T23:34:09.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836679 2026-03-08T23:34:09.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836679' 2026-03-08T23:34:09.462 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836679 2026-03-08T23:34:09.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:09.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836677 -lt 21474836679 2026-03-08T23:34:09.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:34:10.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 
']' 2026-03-08T23:34:10.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:10.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836680 -lt 21474836679 2026-03-08T23:34:10.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:10.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673154 2026-03-08T23:34:10.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:10.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:10.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673154 2026-03-08T23:34:10.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:10.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673154 2026-03-08T23:34:10.825 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673154 2026-03-08T23:34:10.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673154' 2026-03-08T23:34:10.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:34:10.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673154 -lt 42949673154 2026-03-08T23:34:10.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:10.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542331 2026-03-08T23:34:10.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:11.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:11.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542331 2026-03-08T23:34:11.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:11.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542331 2026-03-08T23:34:11.002 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542331 2026-03-08T23:34:11.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542331' 2026-03-08T23:34:11.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:34:11.176 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542331 -lt 60129542331 2026-03-08T23:34:11.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:34:11.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:34:11.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:34:11.332 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:34:11.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:34:11.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:34:11.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:34:11.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:34:11.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:34:11.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:34:11.510 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:34:11.510 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:34:11.510 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:34:11.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:11.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:34:11.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836683 2026-03-08T23:34:11.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836683 2026-03-08T23:34:11.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836683' 2026-03-08T23:34:11.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:11.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:34:11.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673157 2026-03-08T23:34:11.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673157 2026-03-08T23:34:11.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836683 1-42949673157' 2026-03-08T23:34:11.668 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:11.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:34:11.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542334 2026-03-08T23:34:11.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542334 2026-03-08T23:34:11.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836683 1-42949673157 2-60129542334' 2026-03-08T23:34:11.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:11.742 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836683 2026-03-08T23:34:11.742 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:11.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:34:11.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836683 2026-03-08T23:34:11.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:11.745 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836683 
2026-03-08T23:34:11.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836683 2026-03-08T23:34:11.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836683' 2026-03-08T23:34:11.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:11.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836683 -lt 21474836683 2026-03-08T23:34:11.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:11.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673157 2026-03-08T23:34:11.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:11.912 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:11.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673157 2026-03-08T23:34:11.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:11.913 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673157 2026-03-08T23:34:11.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=42949673157 2026-03-08T23:34:11.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673157' 2026-03-08T23:34:11.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:12.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673156 -lt 42949673157 2026-03-08T23:34:12.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:34:13.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:34:13.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:13.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673156 -lt 42949673157 2026-03-08T23:34:13.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:34:14.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:34:14.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:14.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673159 -lt 42949673157 2026-03-08T23:34:14.417 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:14.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542334 2026-03-08T23:34:14.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:14.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:14.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542334 2026-03-08T23:34:14.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:14.419 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542334 2026-03-08T23:34:14.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542334 2026-03-08T23:34:14.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542334' 2026-03-08T23:34:14.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:34:14.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542336 -lt 60129542334 2026-03-08T23:34:14.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 
2026-03-08T23:34:14.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:34:14.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:34:14.743 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:34:14.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:34:14.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:34:14.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:34:14.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:34:14.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:34:14.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:34:14.925 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:34:14.925 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:34:14.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:34:14.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:14.925 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:34:15.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836687 2026-03-08T23:34:15.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836687 2026-03-08T23:34:15.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836687' 2026-03-08T23:34:15.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:15.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:34:15.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673161 2026-03-08T23:34:15.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673161 2026-03-08T23:34:15.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836687 1-42949673161' 2026-03-08T23:34:15.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:15.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:34:15.200 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542339 2026-03-08T23:34:15.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542339 2026-03-08T23:34:15.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836687 1-42949673161 2-60129542339' 2026-03-08T23:34:15.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:15.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836687 2026-03-08T23:34:15.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:15.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:34:15.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836687 2026-03-08T23:34:15.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:15.203 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836687 2026-03-08T23:34:15.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836687 2026-03-08T23:34:15.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836687' 
2026-03-08T23:34:15.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:15.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836685 -lt 21474836687 2026-03-08T23:34:15.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:34:16.387 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:34:16.387 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:16.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836688 -lt 21474836687 2026-03-08T23:34:16.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:16.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673161 2026-03-08T23:34:16.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:16.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:16.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673161 2026-03-08T23:34:16.555 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:16.556 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673161 2026-03-08T23:34:16.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673161 2026-03-08T23:34:16.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673161' 2026-03-08T23:34:16.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:16.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673162 -lt 42949673161 2026-03-08T23:34:16.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:16.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542339 2026-03-08T23:34:16.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:16.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:16.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542339 2026-03-08T23:34:16.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:34:16.727 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542339 2026-03-08T23:34:16.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542339 2026-03-08T23:34:16.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542339' 2026-03-08T23:34:16.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:34:16.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542339 -lt 60129542339 2026-03-08T23:34:16.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:34:16.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:34:16.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:34:17.069 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:34:17.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:34:17.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:34:17.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:34:17.083 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:34:17.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:34:17.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:34:17.251 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:34:17.251 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:34:17.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:34:17.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:17.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:34:17.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836690 2026-03-08T23:34:17.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836690 2026-03-08T23:34:17.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836690' 2026-03-08T23:34:17.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:17.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:34:17.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673165 2026-03-08T23:34:17.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673165 2026-03-08T23:34:17.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836690 1-42949673165' 2026-03-08T23:34:17.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:34:17.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:34:17.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542342 2026-03-08T23:34:17.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542342 2026-03-08T23:34:17.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836690 1-42949673165 2-60129542342' 2026-03-08T23:34:17.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:17.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836690 2026-03-08T23:34:17.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:17.497 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:34:17.497 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836690 2026-03-08T23:34:17.497 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:17.498 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836690 2026-03-08T23:34:17.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836690 2026-03-08T23:34:17.498 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836690' 2026-03-08T23:34:17.498 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:34:17.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836691 -lt 21474836690 2026-03-08T23:34:17.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:17.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673165 2026-03-08T23:34:17.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:17.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:34:17.667 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673165 2026-03-08T23:34:17.667 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:34:17.668 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673165 2026-03-08T23:34:17.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673165 2026-03-08T23:34:17.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673165' 2026-03-08T23:34:17.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:34:17.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673165 -lt 42949673165 2026-03-08T23:34:17.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:34:17.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542342 2026-03-08T23:34:17.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:34:17.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:34:17.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542342 
2026-03-08T23:34:17.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:34:17.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542342
2026-03-08T23:34:17.835 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542342
2026-03-08T23:34:17.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542342'
2026-03-08T23:34:17.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:34:18.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542342 -lt 60129542342
2026-03-08T23:34:18.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:34:18.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:34:18.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:34:18.161 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:34:18.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:34:18.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:34:18.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:34:18.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:34:18.175 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:34:18.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:34:18.344 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:34:18.344 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:34:18.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:34:18.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:34:18.344 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:34:18.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836693
2026-03-08T23:34:18.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836693
2026-03-08T23:34:18.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836693'
2026-03-08T23:34:18.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:34:18.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:34:18.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673167
2026-03-08T23:34:18.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673167
2026-03-08T23:34:18.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836693 1-42949673167'
2026-03-08T23:34:18.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:34:18.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:34:18.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542344
2026-03-08T23:34:18.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542344
2026-03-08T23:34:18.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836693 1-42949673167 2-60129542344'
2026-03-08T23:34:18.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:34:18.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836693
2026-03-08T23:34:18.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:34:18.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:34:18.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836693
2026-03-08T23:34:18.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:34:18.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836693
2026-03-08T23:34:18.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836693'
2026-03-08T23:34:18.593 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836693
2026-03-08T23:34:18.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:34:18.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836691 -lt 21474836693
2026-03-08T23:34:18.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:34:19.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:34:19.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:34:19.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836694 -lt 21474836693
2026-03-08T23:34:19.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:34:19.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673167
2026-03-08T23:34:19.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:34:19.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:34:19.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673167
2026-03-08T23:34:19.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:34:19.936 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673167
2026-03-08T23:34:19.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673167
2026-03-08T23:34:19.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673167'
2026-03-08T23:34:19.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:34:20.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673168 -lt 42949673167
2026-03-08T23:34:20.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:34:20.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542344
2026-03-08T23:34:20.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:34:20.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:34:20.107 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542344
2026-03-08T23:34:20.107 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:34:20.108 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542344
2026-03-08T23:34:20.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542344
2026-03-08T23:34:20.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542344'
2026-03-08T23:34:20.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:34:20.279 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542345 -lt 60129542344
2026-03-08T23:34:20.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:34:20.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:34:20.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:34:20.430 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:34:20.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:34:20.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:34:20.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:34:20.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:34:20.444 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:34:20.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:34:20.607 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:34:20.607 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:34:20.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:34:20.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:34:20.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:34:20.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836696
2026-03-08T23:34:20.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836696
2026-03-08T23:34:20.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836696'
2026-03-08T23:34:20.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:34:20.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:34:20.772 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673170
2026-03-08T23:34:20.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673170
2026-03-08T23:34:20.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836696 1-42949673170'
2026-03-08T23:34:20.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:34:20.773 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:34:20.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542347
2026-03-08T23:34:20.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542347
2026-03-08T23:34:20.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836696 1-42949673170 2-60129542347'
2026-03-08T23:34:20.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:34:20.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836696
2026-03-08T23:34:20.853 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:34:20.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:34:20.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836696
2026-03-08T23:34:20.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:34:20.855 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836696
2026-03-08T23:34:20.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836696
2026-03-08T23:34:20.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836696'
2026-03-08T23:34:20.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:34:21.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836694 -lt 21474836696
2026-03-08T23:34:21.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:34:22.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:34:22.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:34:22.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836697 -lt 21474836696
2026-03-08T23:34:22.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:34:22.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673170
2026-03-08T23:34:22.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:34:22.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:34:22.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673170
2026-03-08T23:34:22.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:34:22.190 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673170
2026-03-08T23:34:22.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673170
2026-03-08T23:34:22.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673170'
2026-03-08T23:34:22.191 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:34:22.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673171 -lt 42949673170
2026-03-08T23:34:22.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:34:22.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542347
2026-03-08T23:34:22.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:34:22.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:34:22.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542347
2026-03-08T23:34:22.372 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:34:22.373 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542347
2026-03-08T23:34:22.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542347
2026-03-08T23:34:22.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542347'
2026-03-08T23:34:22.373 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:34:22.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542348 -lt 60129542347
2026-03-08T23:34:22.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:34:22.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:34:22.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:34:22.694 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:34:22.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:383: _scrub_abort: break
2026-03-08T23:34:22.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:385: _scrub_abort: set +o pipefail
2026-03-08T23:34:22.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:387: _scrub_abort: sleep 5
2026-03-08T23:34:27.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:389: _scrub_abort: grep 'nodeep_scrub set, aborting' td/osd-scrub-test/osd.1.log
2026-03-08T23:34:27.717 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:32:25.780+0000 7f2184d2d640 10 osd.1 pg_epoch: 20 pg[1.0( v 18'1000 (0'0,18'1000] local-lis/les=15/17 n=1000 ec=15/15 lis/c=15/15 les/c/f=17/17/0 sis=15) [1,0,2] r=0 lpr=15 crt=18'1000 lcod 18'999 mlcod 18'999 active+clean+scrubbing+deep [ 1.0: ] TIME_FOR_DEEP] scrubber: nodeep_scrub set, aborting
2026-03-08T23:34:27.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:395: _scrub_abort: get_last_scrub_stamp 1.0
2026-03-08T23:34:27.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:27.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:27.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:27.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:27.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:395: _scrub_abort: local last_scrub=2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:27.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:396: _scrub_abort: ceph config set osd osd_scrub_sleep 0.1
2026-03-08T23:34:28.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:398: _scrub_abort: ceph osd unset nodeep-scrub
2026-03-08T23:34:28.267 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is unset
2026-03-08T23:34:28.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:399: _scrub_abort: '[' deep-scrub = deep-scrub ']'
2026-03-08T23:34:28.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:401: _scrub_abort: ceph osd unset noscrub
2026-03-08T23:34:28.469 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is unset
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:403: _scrub_abort: TIMEOUT=500
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:404: _scrub_abort: wait_for_scrub 1.0 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:28.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:28.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:28.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:28.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:28.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:34:29.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:34:29.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:34:29.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:34:29.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:29.664 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:29.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:29.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:29.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:29.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:34:30.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:34:30.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:34:30.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:34:30.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:30.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:30.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:30.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:31.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:31.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:34:32.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:34:32.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:34:32.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:34:32.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:32.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:32.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:32.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:32.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:32.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:34:33.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:34:33.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:34:33.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:34:33.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:33.211 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:33.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:33.212 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:33.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:33.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:34:34.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:34:34.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:34:34.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:34:34.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:34.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:34.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:34.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:34.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:34.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:34:35.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:34:35.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:34:35.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:34:35.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:35.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:35.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:35.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:35.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:35.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:34:36.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:34:36.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:34:36.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:34:36.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:34:36.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:34:36.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:34:36.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:34:36.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:34:36.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:34:37.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: ((
i++ )) 2026-03-08T23:34:37.899 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:37.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:37.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:37.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:37.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:37.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:38.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:38.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:39.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:39.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:39.067 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:39.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:39.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:39.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:39.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:39.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:39.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:40.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:40.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:40.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:40.247 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:40.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:40.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:40.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:40.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:40.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:41.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:41.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:41.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:41.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:41.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:41.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:41.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:41.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:41.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:42.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:42.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:42.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:42.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:42.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:42.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:42.579 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:42.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:42.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:43.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:43.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:43.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:43.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:43.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:43.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:43.744 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:43.920 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:43.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:44.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:44.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:44.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:44.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:44.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:44.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:44.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:45.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:45.085 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:46.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:46.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:46.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:46.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:46.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:46.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:46.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:46.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:46.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:47.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:34:47.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:47.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:47.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:47.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:47.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:47.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:47.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:47.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:48.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:48.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:48.416 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:48.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:48.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:48.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:48.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:48.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:48.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:49.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:49.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:49.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:49.583 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:49.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:49.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:49.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:49.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:49.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:50.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:50.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:50.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:50.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:50.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:50.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:50.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:50.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:50.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:51.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:51.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:51.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:51.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:51.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:51.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:51.914 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:52.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:52.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:53.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:53.082 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:53.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:53.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:53.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:53.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:53.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:53.249 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:53.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:54.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:54.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:54.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:54.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:54.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:54.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:54.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:54.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:54.410 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:55.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:55.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:55.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:55.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:55.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:55.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:55.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:55.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:55.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:56.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:34:56.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:56.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:56.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:56.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:56.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:56.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:56.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:56.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:57.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:57.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:57.753 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:57.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:57.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:57.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:57.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:57.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:57.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:34:58.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:34:58.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:34:58.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:34:58.915 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:34:58.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:34:58.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:34:58.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:34:59.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:34:59.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:00.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:00.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:00.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:00.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:00.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:00.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:00.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:00.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:00.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:01.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:01.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:01.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:01.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:01.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:01.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:01.233 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:01.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:01.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:02.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:02.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:02.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:02.568 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:02.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:03.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:03.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:03.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:03.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:03.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:03.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:03.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:03.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:03.728 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:04.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:04.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:04.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:04.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:04.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:04.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:04.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:04.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:04.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:05.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:35:05.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:05.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:05.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:05.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:05.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:05.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:06.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:06.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:07.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:07.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:07.057 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:07.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:07.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:07.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:07.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:07.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:07.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:08.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:08.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:08.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:08.220 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:08.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:08.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:08.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:08.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:08.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:09.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:09.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:09.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:09.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:09.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:09.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:09.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:09.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:09.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:10.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:10.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:10.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:10.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:10.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:10.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:10.543 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:10.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:10.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:11.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:11.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:11.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:11.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:11.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:11.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:11.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:11.866 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:11.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:12.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:12.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:12.867 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:12.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:12.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:12.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:12.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:13.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:13.027 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:14.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:14.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:14.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:14.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:14.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:14.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:14.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:14.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:14.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:15.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:35:15.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:15.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:15.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:15.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:15.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:15.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:15.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:15.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:16.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:16.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:16.345 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:16.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:16.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:16.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:16.346 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:16.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:16.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:17.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:17.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:17.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:17.512 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:17.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:17.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:17.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:17.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:17.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:18.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:18.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:18.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:18.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:18.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:18.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:18.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:18.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:18.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
[identical wait_for_scrub iterations from 2026-03-08T23:35:19 through 2026-03-08T23:35:43 elided: one get_last_scrub_stamp 1.0 poll per second, with last_scrub_stamp unchanged at 2026-02-22T23:30:39.201173+0000 on every read]
2026-03-08T23:35:44.452
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:44.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:44.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:44.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:44.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:44.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:44.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:45.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:45.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:45.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:45.639 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:45.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:45.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:45.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:45.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:45.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:46.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:46.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:46.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:46.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:46.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:46.812 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:46.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:46.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:46.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:47.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:47.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:47.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:47.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:47.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:47.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:47.990 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:48.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:48.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:49.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:49.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:49.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:49.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:49.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:49.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:49.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:49.329 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:49.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:50.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:50.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:50.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:50.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:50.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:50.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:50.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:50.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:50.500 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:51.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:51.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:51.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:51.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:51.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:51.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:51.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:51.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:51.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:52.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:35:52.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:52.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:52.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:52.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:52.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:52.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:52.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:52.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:53.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:53.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:53.856 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:53.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:53.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:53.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:53.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:54.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:54.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:55.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:55.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:55.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:55.023 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:55.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:55.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:55.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:55.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:55.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:56.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:56.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:56.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:56.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:56.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:56.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:56.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:56.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:56.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:57.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:57.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:57.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:57.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:57.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:57.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:57.375 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:57.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:57.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:58.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:58.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:58.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:58.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:58.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:58.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:58.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:58.731 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:58.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:35:59.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:35:59.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:35:59.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:35:59.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:35:59.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:35:59.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:35:59.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:35:59.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:35:59.901 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:00.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:00.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:00.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:00.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:00.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:00.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:00.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:01.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:01.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:02.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:36:02.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:02.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:02.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:02.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:02.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:02.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:02.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:02.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:03.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:03.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:03.247 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:03.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:03.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:03.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:03.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:03.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:03.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:04.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:04.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:04.422 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:04.422 
[... identical wait_for_scrub iterations elided: the xtrace above repeats once per second from 2026-03-08T23:36:04 through 2026-03-08T23:36:30, and on every iteration the comparison is test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 -- i.e. last_scrub_stamp for pg 1.0 never advances ...]
i++ )) 2026-03-08T23:36:30.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:30.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:30.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:30.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:30.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:30.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:30.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:30.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:31.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:31.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:31.511 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:31.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:31.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:31.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:31.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:31.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:31.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:32.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:32.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:32.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:32.680 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:32.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:32.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:32.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:32.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:32.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:33.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:33.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:33.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:33.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:33.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:33.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:33.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:34.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:34.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:35.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:35.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:35.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:35.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:35.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:35.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:35.017 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:35.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:35.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:36.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:36.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:36.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:36.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:36.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:36.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:36.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:36.364 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:36.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:37.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:37.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:37.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:37.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:37.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:37.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:37.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:37.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:37.536 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:38.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:38.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:38.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:38.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:38.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:38.537 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:38.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:38.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:38.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:39.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:36:39.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:39.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:39.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:39.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:39.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:39.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:39.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:39.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:40.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:40.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:40.891 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:40.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:40.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:40.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:40.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:41.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:41.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:42.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:42.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:42.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:42.064 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:42.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:42.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:42.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:42.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:42.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:43.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:43.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:43.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:43.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:43.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:43.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:43.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:43.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:43.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:44.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:44.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:44.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:44.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:44.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:44.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:44.409 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:44.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:44.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:45.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:45.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:45.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:45.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:45.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:45.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:45.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:45.734 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:45.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:46.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:46.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:46.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:46.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:46.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:46.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:46.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:46.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:46.926 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:47.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:47.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:47.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:47.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:47.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:48.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:48.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:49.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:36:49.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:49.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:49.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:49.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:49.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:49.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:49.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:49.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:50.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:50.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:50.251 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:50.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:50.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:50.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:50.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:50.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:50.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:51.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:51.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:51.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:51.428 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:51.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:51.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:51.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:51.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:51.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:52.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:52.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:52.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:52.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:52.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:52.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:52.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:52.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:52.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:53.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:53.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:53.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:53.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:53.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:53.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:53.758 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:53.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:53.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:54.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:54.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:54.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:54.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:54.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:54.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:54.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:55.102 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:55.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:56.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:56.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:56.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:56.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:56.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:56.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:56.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:56.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:56.266 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:57.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:57.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:57.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:57.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:57.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:57.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:57.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:57.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:57.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:58.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:36:58.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:58.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:58.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:58.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:58.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:58.438 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:58.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:58.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:36:59.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:36:59.611 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:36:59.612 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:36:59.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:36:59.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:36:59.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:36:59.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:36:59.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:36:59.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:00.787 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:00.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:00.788 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:00.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:00.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:00.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:00.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:00.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:01.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:01.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:01.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:01.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:01.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:01.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:01.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:02.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:02.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:03.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:03.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:03.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:03.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:03.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:03.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:03.126 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:03.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:03.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:04.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:04.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:04.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:04.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:04.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:04.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:04.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:04.458 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:04.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:05.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:05.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:05.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:05.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:05.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:05.460 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:05.460 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:05.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:05.632 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:06.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:06.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:06.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:06.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:06.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:06.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:06.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:06.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:06.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:07.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:37:07.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:07.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:07.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:07.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:07.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:07.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:07.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:07.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:08.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:08.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:08.982 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:08.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:08.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:08.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:08.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:09.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:09.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:10.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:10.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:10.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:10.165 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:10.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:10.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:10.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:10.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:10.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:11.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:11.337 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:11.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:11.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:11.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:11.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:11.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:11.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:11.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
[... identical wait_for_scrub iterations elided: the loop in ceph-helpers.sh:2076-2080 — (( i++ )), (( i < 500 )), get_last_scrub_stamp 1.0 last_scrub_stamp, test <stamp> '>' <stamp>, sleep 1 — repeats roughly once per second from 2026-03-08T23:37:12 through 2026-03-08T23:37:38; on every check last_scrub_stamp for pg 1.0 remains 2026-02-22T23:30:39.201173+0000, i.e. the scrub stamp never advances ...]
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:38.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:38.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:38.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:38.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:38.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:39.367 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:39.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:39.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:39.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:39.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:39.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:39.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:39.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:39.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:40.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:40.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:40.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:40.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:40.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:40.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:40.531 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:40.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:40.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:41.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:41.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:41.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:41.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:41.700 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:41.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:41.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:41.860 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:41.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:42.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:42.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:42.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:42.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:42.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:42.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:42.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:43.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:43.021 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:44.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:44.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:44.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:44.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:44.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:44.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:44.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:44.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:44.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:45.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:37:45.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:45.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:45.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:45.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:45.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:45.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:45.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:45.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:46.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:46.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:46.394 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:46.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:46.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:46.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:46.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:46.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:46.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:47.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:47.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:47.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:47.561 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:47.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:47.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:47.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:47.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:47.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:48.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:48.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:48.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:48.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:48.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:48.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:48.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:48.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:48.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:49.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:49.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:49.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:49.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:49.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:49.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:49.891 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:50.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:50.050 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:51.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:51.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:51.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:51.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:51.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:51.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:51.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:51.216 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:51.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:52.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:52.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:52.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:52.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:52.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:52.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:52.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:52.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:52.378 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:53.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:53.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:53.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:53.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:53.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:53.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:53.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:53.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:53.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:54.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:37:54.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:54.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:54.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:54.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:54.546 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:54.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:54.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:54.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:55.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:55.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:55.702 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:55.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:37:55.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:37:55.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:37:55.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:37:55.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:37:55.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:37:56.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:37:56.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:37:56.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:37:56.862 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:37:56.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:37:56.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:37:56.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:37:57.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000
2026-03-08T23:37:57.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:37:58.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:37:58.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:37:58.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
[... identical wait_for_scrub iterations repeat roughly once per second through 2026-03-08T23:38:24.809; every comparison is test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 — the last_scrub_stamp of pg 1.0 never advances ...]
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:24.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:24.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:24.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:24.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:24.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:25.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:25.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:25.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:25.978 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:25.978 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:25.978 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:25.978 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:26.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:26.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:27.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:27.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:27.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:27.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:27.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:27.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:27.146 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:27.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:28.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:28.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:28.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:28.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:28.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:28.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:28.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:28.477 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:28.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:29.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:29.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:29.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:29.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:29.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:29.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:29.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:29.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:29.650 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:30.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:30.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:30.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:30.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:30.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:30.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:30.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:30.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:30.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:31.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:38:31.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:31.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:31.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:31.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:31.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:31.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:31.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:31.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:32.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:32.982 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:32.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:32.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:32.982 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:32.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:33.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:33.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:34.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:34.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:34.150 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:34.150 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:34.150 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:34.150 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:34.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:34.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:34.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:35.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:35.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:35.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:35.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:35.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:35.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:35.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:35.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:35.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:36.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:36.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:36.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:36.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:36.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:36.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:36.468 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:36.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:36.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:37.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:37.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:37.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:37.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:37.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:37.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:37.628 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:37.794 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:37.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:38.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:38.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:38.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:38.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:38.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:38.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:38.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:38.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:38.948 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:39.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:39.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:39.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:39.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:39.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:39.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:39.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:40.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:40.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:41.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:38:41.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:41.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:41.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:41.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:41.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:41.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:41.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:41.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:42.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:42.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:42.261 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:42.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:38:42.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:38:42.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:38:42.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:38:42.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:38:42.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:38:43.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:38:43.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:38:43.415 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:38:43.415 
[... the wait_for_scrub polling iteration above repeats verbatim about once per second (ceph-helpers.sh lines 2076-2080 and 1526-1529); last_scrub_stamp remains 2026-02-22T23:30:39.201173+0000 on every check, from 2026-03-08T23:38:43 through 2026-03-08T23:39:08 ...]
i++ )) 2026-03-08T23:39:08.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:08.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:08.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:08.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:08.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:08.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:09.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:09.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:10.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:10.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:10.064 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:10.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:10.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:10.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:10.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:10.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:10.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:11.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:11.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:11.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:11.222 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:11.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:11.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:11.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:11.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:11.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:12.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:12.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:12.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:12.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:12.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:12.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:12.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:12.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:12.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:13.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:13.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:13.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:13.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:13.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:13.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:13.553 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:13.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:13.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:14.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:14.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:14.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:14.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:14.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:14.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:14.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:14.894 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:14.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:15.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:15.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:15.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:15.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:15.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:15.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:15.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:16.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:16.064 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:17.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:17.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:17.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:17.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:17.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:17.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:17.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:17.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:17.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:18.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:39:18.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:18.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:18.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:18.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:18.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:18.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:18.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:18.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:19.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:19.421 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:19.421 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:19.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:19.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:19.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:19.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:19.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:19.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:20.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:20.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:20.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:20.596 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:20.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:20.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:20.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:20.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:20.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:21.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:21.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:21.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:21.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:21.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:21.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:21.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:21.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:21.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:22.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:22.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:22.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:22.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:22.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:22.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:22.941 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:23.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:23.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:24.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:24.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:24.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:24.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:24.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:24.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:24.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:24.284 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:24.284 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:25.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:25.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:25.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:25.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:25.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:25.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:25.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:25.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:25.456 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:26.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:26.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:26.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:26.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:26.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:26.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:26.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:26.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:26.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:27.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:39:27.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:27.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:27.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:27.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:27.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:27.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:27.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:27.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:28.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:28.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:28.813 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:28.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:28.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:28.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:28.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:28.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:28.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:29.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:29.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:29.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:29.986 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:29.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:29.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:29.986 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:30.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:30.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:31.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:31.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:31.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:31.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:31.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:31.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:31.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:31.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:31.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:32.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:32.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:32.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:32.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:32.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:32.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:32.352 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:32.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:32.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:33.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:33.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:33.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:33.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:33.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:33.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:33.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:33.715 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:33.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:34.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:34.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:34.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:34.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:34.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:34.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:34.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:34.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:34.885 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:35.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:35.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:35.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:35.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:35.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:35.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:35.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:36.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:36.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:37.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:39:37.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:37.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:37.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:37.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:37.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:37.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:37.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:37.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:38.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:38.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:38.243 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:38.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:38.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:38.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:38.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:38.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:38.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:39.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:39.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:39.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:39.421 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:39.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:39.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:39.421 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:39.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:39.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:40.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:40.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:40.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:40.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:40.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:40.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:40.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:40.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:40.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:41.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:41.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:41.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:41.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:41.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:41.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:41.770 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:41.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:41.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:42.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:42.950 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:42.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:42.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:42.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:42.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:42.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:43.122 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:43.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:44.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:44.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:44.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:44.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:44.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:44.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:44.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:44.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:44.304 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:45.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:45.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:45.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:45.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:45.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:45.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:45.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:45.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:45.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:46.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:39:46.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:46.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:46.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:46.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:46.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:46.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:46.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:46.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:47.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:47.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:47.679 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:47.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:47.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:47.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:48.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:48.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:48.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:48.860 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:48.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:48.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:48.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:49.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:49.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:50.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:50.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:50.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:50.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:50.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:50.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:50.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:39:50.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-02-22T23:30:39.201173+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:39:50.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:39:51.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:39:51.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:39:51.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:39:51.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:39:51.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:39:51.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:39:51.230 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:40:10.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:40:08.721069+0000 '>' 2026-02-22T23:30:39.201173+0000 2026-03-08T23:40:10.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:40:10.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:405: _scrub_abort: perf_counters td/osd-scrub-test 3 2026-03-08T23:40:10.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:43: perf_counters: local dir=td/osd-scrub-test 2026-03-08T23:40:10.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:44: perf_counters: local OSDS=3 2026-03-08T23:40:10.175 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: expr 3 - 1 2026-03-08T23:40:10.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: seq 0 2 2026-03-08T23:40:10.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:40:10.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.0 counter dump 2026-03-08T23:40:10.177 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.241 
INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: 
"osd_scrub_dp_repl": [ 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.241 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 
2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: 
"pooltype": "ec" 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.242 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.243 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.243 
INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:40:10.243 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: }, 
2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.1 counter dump 2026-03-08T23:40:10.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.317 
INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 
2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.317 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.318 
INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 2, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 2, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 1, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 1, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 335.229496279, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 335.229496279 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 1, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 1, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 4.995994128, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 4.995994128 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 67, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 2, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 2, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 
2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 2 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:40:10.318 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 
2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: 
"reservation_process_failure": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: 
"avgtime": 0 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.319 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 
2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.2 counter dump 2026-03-08T23:40:10.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:40:10.393 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:40:10.393 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 
2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: 
"reservation_process_aborted": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:40:10.394 
INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.394 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "labels": {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow",
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec"
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "counters": {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": {
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0,
2026-03-08T23:40:10.395 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": {
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "labels": {
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow",
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated"
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "counters": {
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": {
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": {
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0,
2026-03-08T23:40:10.396 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": {
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0,
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0,
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0,
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": {
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:40:10.404 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_deep_scrub_abort ------------------
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_deep_scrub_abort ------------------'
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:40:10.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:40:10.524 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:40:10.524 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:40:10.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:40:10.525 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:40:10.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:40:10.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:40:10.526 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:40:10.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:40:10.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:40:10.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:40:10.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:40:10.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:40:10.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:40:10.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test
2026-03-08T23:40:10.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:40:10.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:40:10.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:40:10.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827
2026-03-08T23:40:10.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:40:10.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:40:10.594 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_deep_scrub_abort ------------------
2026-03-08T23:40:10.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_deep_scrub_abort ------------------'
2026-03-08T23:40:10.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs
2026-03-08T23:40:10.594 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_dump_scrub_schedule -------------------
2026-03-08T23:40:10.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_dump_scrub_schedule -------------------'
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:40:10.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:40:10.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:40:10.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:40:10.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:40:10.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:40:10.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:40:10.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:40:10.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:40:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:40:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:40:10.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:40:10.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:40:10.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:40:10.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:40:10.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test
2026-03-08T23:40:10.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:40:10.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:40:10.603 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:40:10.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827
2026-03-08T23:40:10.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:40:10.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:40:10.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test
2026-03-08T23:40:10.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:40:10.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:40:10.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:40:10.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827
2026-03-08T23:40:10.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:40:10.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:40:10.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:40:10.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT
2026-03-08T23:40:10.607 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_dump_scrub_schedule -----------------------
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_dump_scrub_schedule -----------------------'
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_dump_scrub_schedule td/osd-scrub-test
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:530: TEST_dump_scrub_schedule: local dir=td/osd-scrub-test
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:531: TEST_dump_scrub_schedule: local poolname=test
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:532: TEST_dump_scrub_schedule: local OSDS=3
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:533: TEST_dump_scrub_schedule: local objects=90
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:535: TEST_dump_scrub_schedule: TESTDATA=testdata.475827
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:537: TEST_dump_scrub_schedule: run_mon td/osd-scrub-test a --osd_pool_default_size=3
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-test
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-test/a
2026-03-08T23:40:10.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-test/a --run-dir=td/osd-scrub-test --osd_pool_default_size=3
2026-03-08T23:40:10.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:40:10.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:40:10.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:40:10.631 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:40:10.631 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:40:10.631 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:40:10.631 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:40:10.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-test/a '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-test/log --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=3
2026-03-08T23:40:10.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:40:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:40:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:40:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:40:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:40:10.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:40:10.661 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:40:10.661 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:40:10.661 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:40:10.662 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:40:10.662 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:40:10.662 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:40:10.662 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok
2026-03-08T23:40:10.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:40:10.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get fsid
2026-03-08T23:40:10.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:40:10.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:40:10.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:40:10.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:40:10.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:40:10.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:40:10.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:40:10.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:40:10.739 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:40:10.739 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:40:10.739 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:40:10.739 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok
2026-03-08T23:40:10.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:40:10.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get mon_host
2026-03-08T23:40:10.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:538: TEST_dump_scrub_schedule: run_mgr td/osd-scrub-test x --mgr_stats_period=1
2026-03-08T23:40:10.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-test
2026-03-08T23:40:10.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:40:10.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:40:10.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:40:10.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-test/x
2026-03-08T23:40:10.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:40:10.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:40:10.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:40:10.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:40:10.904 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:40:10.904 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:40:10.904 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:40:10.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:40:10.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:40:10.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-test/x '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mgr_stats_period=1
2026-03-08T23:40:10.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:549: TEST_dump_scrub_schedule: local 'ceph_osd_args=--osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2'
2026-03-08T23:40:10.923 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:551: TEST_dump_scrub_schedule: expr 3 - 1
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:551: TEST_dump_scrub_schedule: seq 0 2
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:551: TEST_dump_scrub_schedule: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:553: TEST_dump_scrub_schedule: run_osd td/osd-scrub-test 0 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/0
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 '
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/0'
2026-03-08T23:40:10.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/0/journal'
2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test'
2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:40:10.933
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:40:10.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:40:10.933 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2' 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/0 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: 
run_osd: local uuid=326b691a-fe19-497a-bc86-0127c3ea006d 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 326b691a-fe19-497a-bc86-0127c3ea006d 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 326b691a-fe19-497a-bc86-0127c3ea006d' 2026-03-08T23:40:10.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:40:10.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDaCK5pSd7ZOBAAS9HErSUK4jjslFBKp2Azpw== 2026-03-08T23:40:10.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDaCK5pSd7ZOBAAS9HErSUK4jjslFBKp2Azpw=="}' 2026-03-08T23:40:10.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 326b691a-fe19-497a-bc86-0127c3ea006d -i td/osd-scrub-test/0/new.json 2026-03-08T23:40:11.045 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:40:11.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/0/new.json 2026-03-08T23:40:11.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 
'--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2 --mkfs --key AQDaCK5pSd7ZOBAAS9HErSUK4jjslFBKp2Azpw== --osd-uuid 326b691a-fe19-497a-bc86-0127c3ea006d 2026-03-08T23:40:11.079 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:11.086+0000 7f52816ea8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:11.089 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:11.094+0000 7f52816ea8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:11.090 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:11.094+0000 7f52816ea8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:40:11.090 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:11.094+0000 7f52816ea8c0 -1 bdev(0x5649ea989c00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:40:11.091 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:11.094+0000 7f52816ea8c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid 2026-03-08T23:40:13.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/0/keyring 2026-03-08T23:40:13.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:40:13.400 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:40:13.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:40:13.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:40:13.534 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:40:13.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:40:13.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 
--debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2 2026-03-08T23:40:13.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:40:13.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:40:13.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:40:13.560 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:13.566+0000 7fb6971db8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:13.561 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:13.566+0000 7fb6971db8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:13.563 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:13.570+0000 7fb6971db8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:13.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:40:13.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:40:14.535 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:14.542+0000 7fb6971db8c0 -1 Falling back to public interface 2026-03-08T23:40:14.880 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:40:14.880 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:40:14.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:14.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:40:14.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:14.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:40:15.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:40:15.512 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:15.518+0000 7fb6971db8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:40:16.042 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:40:16.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:40:16.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:16.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:40:16.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:16.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:40:16.200 
INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/278630101,v1:127.0.0.1:6803/278630101] [v2:127.0.0.1:6804/278630101,v1:127.0.0.1:6805/278630101] exists,up 326b691a-fe19-497a-bc86-0127c3ea006d 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:551: TEST_dump_scrub_schedule: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:553: TEST_dump_scrub_schedule: run_osd td/osd-scrub-test 1 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:40:16.200 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/1 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/1' 2026-03-08T23:40:16.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/1/journal' 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: 
ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:40:16.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 
2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2' 2026-03-08T23:40:16.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/1 2026-03-08T23:40:16.203 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:40:16.204 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 0d91a7e7-40e5-4829-a482-03513b2e5a69 2026-03-08T23:40:16.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0d91a7e7-40e5-4829-a482-03513b2e5a69 2026-03-08T23:40:16.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 0d91a7e7-40e5-4829-a482-03513b2e5a69' 2026-03-08T23:40:16.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:40:16.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDgCK5p8KBWDRAAWFnBkdPhiP7tkBHn6dipuQ== 2026-03-08T23:40:16.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDgCK5p8KBWDRAAWFnBkdPhiP7tkBHn6dipuQ=="}' 2026-03-08T23:40:16.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0d91a7e7-40e5-4829-a482-03513b2e5a69 -i td/osd-scrub-test/1/new.json 2026-03-08T23:40:16.368 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:40:16.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/1/new.json 2026-03-08T23:40:16.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2 --mkfs --key AQDgCK5p8KBWDRAAWFnBkdPhiP7tkBHn6dipuQ== --osd-uuid 0d91a7e7-40e5-4829-a482-03513b2e5a69 2026-03-08T23:40:16.397 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:16.402+0000 7f88de95d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:16.399 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:16.406+0000 7f88de95d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:16.400 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:16.406+0000 7f88de95d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:40:16.400 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:16.406+0000 7f88de95d8c0 -1 bdev(0x562e91c9dc00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:40:16.400 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:16.406+0000 7f88de95d8c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid 2026-03-08T23:40:18.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/1/keyring 2026-03-08T23:40:18.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:40:18.664 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:40:18.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:40:18.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:40:18.866 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:40:18.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:40:18.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 
--debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2 2026-03-08T23:40:18.866 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:40:18.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:40:18.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:40:18.882 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:18.886+0000 7f0e61a798c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:18.883 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:18.890+0000 7f0e61a798c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:18.884 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:18.890+0000 7f0e61a798c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:19.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:40:19.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:40:20.087 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:20.094+0000 7f0e61a798c0 -1 Falling back to public interface 2026-03-08T23:40:20.203 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:40:20.203 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:40:20.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:20.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:40:20.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:20.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:40:20.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:40:21.039 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:21.046+0000 7f0e61a798c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:40:21.365 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:40:21.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:40:21.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:21.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:40:21.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:21.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:40:21.532 
INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3267377638,v1:127.0.0.1:6811/3267377638] [v2:127.0.0.1:6812/3267377638,v1:127.0.0.1:6813/3267377638] exists,up 0d91a7e7-40e5-4829-a482-03513b2e5a69 2026-03-08T23:40:21.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:40:21.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:551: TEST_dump_scrub_schedule: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:553: TEST_dump_scrub_schedule: run_osd td/osd-scrub-test 2 --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:40:21.533 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/2 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/2' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/2/journal' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: 
ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:40:21.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 
2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2' 2026-03-08T23:40:21.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/2 2026-03-08T23:40:21.534 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:40:21.535 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 39832d90-f7e4-4bef-9d73-2daec9f5e3e2 2026-03-08T23:40:21.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=39832d90-f7e4-4bef-9d73-2daec9f5e3e2 2026-03-08T23:40:21.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 39832d90-f7e4-4bef-9d73-2daec9f5e3e2' 2026-03-08T23:40:21.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:40:21.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDlCK5p6iweIRAADaJDHNZAjEk+MZBimXFXSQ== 2026-03-08T23:40:21.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDlCK5p6iweIRAADaJDHNZAjEk+MZBimXFXSQ=="}' 2026-03-08T23:40:21.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 39832d90-f7e4-4bef-9d73-2daec9f5e3e2 -i td/osd-scrub-test/2/new.json 2026-03-08T23:40:21.700 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:40:21.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/2/new.json 2026-03-08T23:40:21.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 
--osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2 --mkfs --key AQDlCK5p6iweIRAADaJDHNZAjEk+MZBimXFXSQ== --osd-uuid 39832d90-f7e4-4bef-9d73-2daec9f5e3e2 2026-03-08T23:40:21.732 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:21.738+0000 7fa1296e28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:21.734 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:21.738+0000 7fa1296e28c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:21.735 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:21.742+0000 7fa1296e28c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:40:21.735 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:21.742+0000 7fa1296e28c0 -1 bdev(0x562feae19c00 td/osd-scrub-test/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:40:21.735 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:21.742+0000 7fa1296e28c0 -1 bluestore(td/osd-scrub-test/2) _read_fsid unparsable uuid 2026-03-08T23:40:23.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/2/keyring 2026-03-08T23:40:23.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:40:23.996 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:40:23.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:40:23.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:40:24.199 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:40:24.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:40:24.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 
--debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_deep_scrub_randomize_ratio=0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_backoff_ratio=0.0 --osd_op_queue=wpq --osd_stats_update_period_not_scrubbing=1 --osd_stats_update_period_scrubbing=1 --osd_scrub_sleep=0.2 2026-03-08T23:40:24.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:40:24.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:40:24.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:40:24.215 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:24.222+0000 7f7256e888c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:24.220 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:24.226+0000 7f7256e888c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:40:24.221 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:24.226+0000 7f7256e888c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:24.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:40:24.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:40:25.415 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:25.422+0000 7f7256e888c0 -1 Falling back to public interface 2026-03-08T23:40:25.543 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:40:25.543 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:40:25.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:25.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:40:25.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:25.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:40:25.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:40:26.513 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:40:26.518+0000 7f7256e888c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:40:26.717 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:40:26.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:40:26.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:40:26.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:40:26.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:40:26.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:40:26.892 
INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2971560626,v1:127.0.0.1:6819/2971560626] [v2:127.0.0.1:6820/2971560626,v1:127.0.0.1:6821/2971560626] exists,up 39832d90-f7e4-4bef-9d73-2daec9f5e3e2 2026-03-08T23:40:26.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:40:26.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:40:26.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:40:26.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:557: TEST_dump_scrub_schedule: create_pool test 1 1 2026-03-08T23:40:26.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T23:40:27.100 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:40:27.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:40:28.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:558: TEST_dump_scrub_schedule: wait_for_clean 2026-03-08T23:40:28.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:40:28.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:40:28.117 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:40:28.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:40:28.118 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:40:28.118 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:40:28.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:40:28.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:40:28.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:40:28.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:40:28.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:40:28.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:40:28.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:40:28.181 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:40:28.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:40:28.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:40:28.359 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:40:28.359 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:40:28.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:40:28.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:40:28.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:40:28.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836493 2026-03-08T23:40:28.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836493 2026-03-08T23:40:28.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493' 2026-03-08T23:40:28.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:40:28.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:40:28.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672968 2026-03-08T23:40:28.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672968 2026-03-08T23:40:28.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493 1-42949672968' 2026-03-08T23:40:28.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:40:28.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:40:28.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542146 2026-03-08T23:40:28.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542146 2026-03-08T23:40:28.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493 1-42949672968 2-60129542146' 2026-03-08T23:40:28.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:40:28.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836493 2026-03-08T23:40:28.596 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:40:28.597 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:40:28.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836493 2026-03-08T23:40:28.598 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:40:28.599 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836493 2026-03-08T23:40:28.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836493 2026-03-08T23:40:28.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836493' 2026-03-08T23:40:28.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:40:28.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836491 -lt 21474836493 2026-03-08T23:40:28.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:40:29.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:40:29.762 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:40:29.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836494 -lt 
21474836493 2026-03-08T23:40:29.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:40:29.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672968 2026-03-08T23:40:29.924 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:40:29.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:40:29.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672968 2026-03-08T23:40:29.925 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:40:29.926 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672968 2026-03-08T23:40:29.926 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672968 2026-03-08T23:40:29.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672968' 2026-03-08T23:40:29.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:40:30.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672969 -lt 42949672968 2026-03-08T23:40:30.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:40:30.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542146 2026-03-08T23:40:30.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:40:30.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:40:30.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542146 2026-03-08T23:40:30.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:40:30.090 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542146 2026-03-08T23:40:30.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542146 2026-03-08T23:40:30.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542146' 2026-03-08T23:40:30.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:40:30.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542147 -lt 60129542146 2026-03-08T23:40:30.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:40:30.251 
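The `flush_pg_stats` helper traced above tells each OSD to flush its PG stats, collects the returned sequence numbers as `osd-seq` pairs, and then polls `ceph osd last-stat-seq` per OSD until each has caught up. A stand-alone sketch of the pair-parsing and wait structure, with the `ceph` calls stubbed out and the seq values taken from this run:

```shell
# Condensed sketch of flush_pg_stats from ceph-helpers.sh as reconstructed
# from the trace above. The `ceph tell osd.N flush_pg_stats` and
# `ceph osd last-stat-seq N` calls are stubbed out; the osd-seq pairs are
# the ones observed in this run.
seqs=' 0-21474836493 1-42949672968 2-60129542146'
for s in $seqs ; do
    osd=$(echo "$s" | cut -d - -f 1)   # OSD id, before the dash
    seq=$(echo "$s" | cut -d - -f 2)   # flush sequence number, after the dash
    echo "waiting osd.$osd seq $seq"
    # real helper: loop `ceph osd last-stat-seq $osd`, sleeping 1s while
    # the reported value is still less than $seq (up to a retry budget)
done
```

The trace shows why the poll loop is needed: osd.0 first reports `21474836491`, which is below the target `21474836493`, so the helper sleeps and re-checks before moving on to osd.1.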
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:40:30.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:40:30.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:40:30.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:40:30.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:40:30.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:40:30.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:40:30.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:40:30.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:40:30.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:40:30.610 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:40:30.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:40:30.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:40:30.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:40:30.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:40:30.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:40:30.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:40:30.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:559: TEST_dump_scrub_schedule: ceph osd dump 2026-03-08T23:40:30.816 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:559: TEST_dump_scrub_schedule: awk '{ print $2 }' 2026-03-08T23:40:30.817 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:559: TEST_dump_scrub_schedule: grep '^pool.*['\'']test['\'']' 2026-03-08T23:40:30.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:559: TEST_dump_scrub_schedule: poolid=1 2026-03-08T23:40:30.984 
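`wait_for_clean`, as traced above, compares the count of active+clean (non-stale) PGs from `ceph pg dump pgs` against `.pgmap.num_pgs` from `ceph status`, both extracted with jq. A minimal sketch of that comparison, with the cluster queries stubbed to the values observed in this run (1 PG total, 1 active+clean):

```shell
# Sketch of the wait_for_clean check from ceph-helpers.sh. The real helper
# pipes `ceph --format json status` through `jq .pgmap.num_pgs` and
# `ceph --format json pg dump pgs` through
#   jq '.pg_stats | [.[] | .state
#        | select(contains("active") and contains("clean"))
#        | select(contains("stale") | not)] | length'
# Both cluster queries are stubbed here with this run's observed values.
get_num_pgs() { echo 1; }
get_num_active_clean() { echo 1; }

num_pgs=$(get_num_pgs)
cur_active_clean=$(get_num_active_clean)
if test "$cur_active_clean" = "$num_pgs" ; then
    status=clean       # real helper breaks out of its retry loop here
else
    status=waiting     # real helper sleeps and re-polls
fi
echo "$status: $cur_active_clean/$num_pgs PGs active+clean"
```

In the trace the two counts match on the first poll (`test 1 = 1`), so the helper breaks immediately and `TEST_dump_scrub_schedule` proceeds to locate the `test` pool id and start writing objects.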
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:561: TEST_dump_scrub_schedule: dd if=/dev/urandom of=testdata.475827 bs=1032 count=1 2026-03-08T23:40:30.985 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:40:30.985 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:40:30.985 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 4.3512e-05 s, 23.7 MB/s 2026-03-08T23:40:30.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: seq 1 90 2026-03-08T23:40:30.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:30.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj1 testdata.475827 2026-03-08T23:40:31.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj2 testdata.475827 2026-03-08T23:40:31.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj3 testdata.475827 2026-03-08T23:40:31.046 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj4 testdata.475827 2026-03-08T23:40:31.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj5 testdata.475827 2026-03-08T23:40:31.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj6 testdata.475827 2026-03-08T23:40:31.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj7 testdata.475827 2026-03-08T23:40:31.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj8 
testdata.475827 2026-03-08T23:40:31.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj9 testdata.475827 2026-03-08T23:40:31.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj10 testdata.475827 2026-03-08T23:40:31.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj11 testdata.475827 2026-03-08T23:40:31.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj12 testdata.475827 2026-03-08T23:40:31.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: 
TEST_dump_scrub_schedule: rados -p test put obj13 testdata.475827 2026-03-08T23:40:31.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj14 testdata.475827 2026-03-08T23:40:31.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj15 testdata.475827 2026-03-08T23:40:31.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj16 testdata.475827 2026-03-08T23:40:31.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj17 testdata.475827 2026-03-08T23:40:31.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.317 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj18 testdata.475827 2026-03-08T23:40:31.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj19 testdata.475827 2026-03-08T23:40:31.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj20 testdata.475827 2026-03-08T23:40:31.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj21 testdata.475827 2026-03-08T23:40:31.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj22 testdata.475827 2026-03-08T23:40:31.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in 
`seq 1 $objects` 2026-03-08T23:40:31.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj23 testdata.475827 2026-03-08T23:40:31.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj24 testdata.475827 2026-03-08T23:40:31.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj25 testdata.475827 2026-03-08T23:40:31.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj26 testdata.475827 2026-03-08T23:40:31.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj27 testdata.475827 2026-03-08T23:40:31.541 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj28 testdata.475827 2026-03-08T23:40:31.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.561 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj29 testdata.475827 2026-03-08T23:40:31.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.582 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj30 testdata.475827 2026-03-08T23:40:31.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj31 testdata.475827 2026-03-08T23:40:31.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj32 
testdata.475827 2026-03-08T23:40:31.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj33 testdata.475827 2026-03-08T23:40:31.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj34 testdata.475827 2026-03-08T23:40:31.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj35 testdata.475827 2026-03-08T23:40:31.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj36 testdata.475827 2026-03-08T23:40:31.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: 
TEST_dump_scrub_schedule: rados -p test put obj37 testdata.475827 2026-03-08T23:40:31.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj38 testdata.475827 2026-03-08T23:40:31.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj39 testdata.475827 2026-03-08T23:40:31.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj40 testdata.475827 2026-03-08T23:40:31.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj41 testdata.475827 2026-03-08T23:40:31.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.856 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj42 testdata.475827 2026-03-08T23:40:31.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj43 testdata.475827 2026-03-08T23:40:31.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.943 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj44 testdata.475827 2026-03-08T23:40:31.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj45 testdata.475827 2026-03-08T23:40:31.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:31.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj46 testdata.475827 2026-03-08T23:40:32.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in 
`seq 1 $objects` 2026-03-08T23:40:32.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj47 testdata.475827 2026-03-08T23:40:32.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:32.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj48 testdata.475827 2026-03-08T23:40:32.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:32.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj49 testdata.475827 2026-03-08T23:40:32.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:32.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj50 testdata.475827 2026-03-08T23:40:32.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects` 2026-03-08T23:40:32.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj51 testdata.475827 2026-03-08T23:40:32.096 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj52 testdata.475827
2026-03-08T23:40:32.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj53 testdata.475827
2026-03-08T23:40:32.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj54 testdata.475827
2026-03-08T23:40:32.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj55 testdata.475827
2026-03-08T23:40:32.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj56 testdata.475827
2026-03-08T23:40:32.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj57 testdata.475827
2026-03-08T23:40:32.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj58 testdata.475827
2026-03-08T23:40:32.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj59 testdata.475827
2026-03-08T23:40:32.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj60 testdata.475827
2026-03-08T23:40:32.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj61 testdata.475827
2026-03-08T23:40:32.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj62 testdata.475827
2026-03-08T23:40:32.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj63 testdata.475827
2026-03-08T23:40:32.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj64 testdata.475827
2026-03-08T23:40:32.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj65 testdata.475827
2026-03-08T23:40:32.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj66 testdata.475827
2026-03-08T23:40:32.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj67 testdata.475827
2026-03-08T23:40:32.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj68 testdata.475827
2026-03-08T23:40:32.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj69 testdata.475827
2026-03-08T23:40:32.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj70 testdata.475827
2026-03-08T23:40:32.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj71 testdata.475827
2026-03-08T23:40:32.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj72 testdata.475827
2026-03-08T23:40:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.525 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj73 testdata.475827
2026-03-08T23:40:32.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj74 testdata.475827
2026-03-08T23:40:32.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj75 testdata.475827
2026-03-08T23:40:32.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj76 testdata.475827
2026-03-08T23:40:32.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj77 testdata.475827
2026-03-08T23:40:32.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj78 testdata.475827
2026-03-08T23:40:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj79 testdata.475827
2026-03-08T23:40:32.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj80 testdata.475827
2026-03-08T23:40:32.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj81 testdata.475827
2026-03-08T23:40:32.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj82 testdata.475827
2026-03-08T23:40:32.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj83 testdata.475827
2026-03-08T23:40:32.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj84 testdata.475827
2026-03-08T23:40:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj85 testdata.475827
2026-03-08T23:40:32.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj86 testdata.475827
2026-03-08T23:40:32.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj87 testdata.475827
2026-03-08T23:40:32.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj88 testdata.475827
2026-03-08T23:40:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj89 testdata.475827
2026-03-08T23:40:32.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:562: TEST_dump_scrub_schedule: for i in `seq 1 $objects`
2026-03-08T23:40:32.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:564: TEST_dump_scrub_schedule: rados -p test put obj90 testdata.475827
2026-03-08T23:40:32.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:566: TEST_dump_scrub_schedule: rm -f testdata.475827
2026-03-08T23:40:32.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:568: TEST_dump_scrub_schedule: local pgid=1.0
2026-03-08T23:40:32.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:570: TEST_dump_scrub_schedule: date +%Y-%m-%dT%H:%M:%S.%N%:z
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:570: TEST_dump_scrub_schedule: local now_is=2026-03-08T23:40:32.916822999+00:00
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:578: TEST_dump_scrub_schedule: expct_starting=(['query_active']='false' ['query_is_future']='true' ['query_schedule']='scrub scheduled')
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:578: TEST_dump_scrub_schedule: declare -A expct_starting
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:579: TEST_dump_scrub_schedule: declare -A sched_data
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:580: TEST_dump_scrub_schedule: extract_published_sch 1.0 2026-03-08T23:40:32.916822999+00:00 2019-10-12T20:32:43.645168+0000 sched_data
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:17: extract_published_sch: local pgn=1.0
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:18: extract_published_sch: local -n dict=sched_data
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:19: extract_published_sch: local current_time=2026-03-08T23:40:32.916822999+00:00
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:20: extract_published_sch: local extra_time=2019-10-12T20:32:43.645168+0000
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:21: extract_published_sch: local extr_dbg=2
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:24: extract_published_sch: local saved_echo_flag=x
2026-03-08T23:40:32.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:25: extract_published_sch: set +x
2026-03-08T23:40:33.058 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: {
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean",
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false,
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 0,
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "periodic scrub scheduled",
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-09T23:40:27.108058+0000",
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true,
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true,
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 18,
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 30
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:40:33.071 INFO:tasks.workunit.client.0.vm03.stdout:query output:
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": {
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "active": false,
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false,
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false,
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false,
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false,
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-09T23:40:27.108058+0000",
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-09T23:40:27.108 (2026-03-09T23:40:27.108)"
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:33.149 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {}
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:(
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=18
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=112
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled'
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-09T23:40:27.108'
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-09T23:40:27.108'
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=0
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-08T23:40:27.108058+0000'
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='0x0'
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=true
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean'
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=0
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled'
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-09T23:40:27.108058+0000'
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=18
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=30
2026-03-08T23:40:33.238 INFO:tasks.workunit.client.0.vm03.stdout:)
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:581: TEST_dump_scrub_schedule: schedule_against_expected sched_data expct_starting initial
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:178: schedule_against_expected: local -n dict=sched_data
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:179: schedule_against_expected: local -n ep=expct_starting
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:180: schedule_against_expected: local extr_dbg=1
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:183: schedule_against_expected: local saved_echo_flag=x
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:184: schedule_against_expected: set +x
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stdout:-- - comparing:
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stdout:key is query_schedule expected: scrub scheduled in actual: scrub scheduled
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stdout:key is query_is_future expected: true in actual: true
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stdout:key is query_active expected: false in actual: false
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:203: schedule_against_expected: return 0
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:582: TEST_dump_scrub_schedule: (( 0 == 0 ))
2026-03-08T23:40:33.250 INFO:tasks.workunit.client.0.vm03.stdout:last-scrub --- 0x0
2026-03-08T23:40:33.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:583: TEST_dump_scrub_schedule: echo 'last-scrub --- ' 0x0
2026-03-08T23:40:33.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:589: TEST_dump_scrub_schedule: saved_last_stamp=2026-03-08T23:40:27.108058+0000
2026-03-08T23:40:33.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:590: TEST_dump_scrub_schedule: ceph tell 'osd.*' config set osd_scrub_sleep 0
2026-03-08T23:40:33.317 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: {
2026-03-08T23:40:33.317 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_sleep = '' "
2026-03-08T23:40:33.317 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:40:33.324 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: {
2026-03-08T23:40:33.324 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_sleep = '' "
2026-03-08T23:40:33.324 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:40:33.331 INFO:tasks.workunit.client.0.vm03.stdout:osd.2: {
2026-03-08T23:40:33.331 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_sleep = '' "
2026-03-08T23:40:33.331 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:40:33.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:591: TEST_dump_scrub_schedule: ceph tell 1.0 deep-scrub
2026-03-08T23:40:33.414 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:40:33.414 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true,
2026-03-08T23:40:33.414 INFO:tasks.workunit.client.0.vm03.stdout: "must": true,
2026-03-08T23:40:33.414 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "0.000000"
2026-03-08T23:40:33.414 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:40:33.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:596: TEST_dump_scrub_schedule: sleep 5
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:597: TEST_dump_scrub_schedule: sched_data=()
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:598: TEST_dump_scrub_schedule: expct_qry_duration=(['query_last_duration']='0' ['query_last_duration_neg']='not0')
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:598: TEST_dump_scrub_schedule: declare -A expct_qry_duration
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:599: TEST_dump_scrub_schedule: wait_any_cond 1.0 10 2026-03-08T23:40:27.108058+0000 expct_qry_duration 'WaitingAfterScrub ' sched_data
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:106: wait_any_cond: local pgid=1.0
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:107: wait_any_cond: local retries=10
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:108: wait_any_cond: local cmp_date=2026-03-08T23:40:27.108058+0000
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:109: wait_any_cond: local -n ep=expct_qry_duration
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:110: wait_any_cond: local -n out_array=sched_data
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:111: wait_any_cond: local -A sc_data
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:112: wait_any_cond: local extr_dbg=2
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:115: wait_any_cond: local saved_echo_flag=x
2026-03-08T23:40:38.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:116: wait_any_cond: set +x
2026-03-08T23:40:38.427 INFO:tasks.workunit.client.0.vm03.stdout:waiting for any condition (WaitingAfterScrub ): pg:1.0 dt:2026-03-08T23:40:27.108058+0000 (10 retries)
2026-03-08T23:40:39.093 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:40:39.106 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: {
2026-03-08T23:40:39.106 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean",
2026-03-08T23:40:39.106 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false,
2026-03-08T23:40:39.106 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1,
2026-03-08T23:40:39.106 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "periodic scrub scheduled",
2026-03-08T23:40:39.106 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-09T23:40:34.314854+0000",
2026-03-08T23:40:39.106 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true,
2026-03-08T23:40:39.106 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true,
2026-03-08T23:40:39.107 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 18,
2026-03-08T23:40:39.107 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 130
2026-03-08T23:40:39.107 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:40:39.107 INFO:tasks.workunit.client.0.vm03.stdout:query output:
2026-03-08T23:40:39.188 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": {
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: "active": false,
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false,
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false,
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false,
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false,
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-09T23:40:34.314854+0000",
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-09T23:40:34.314 (2026-03-09T23:40:34.314)"
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:39.189 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {}
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:(
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=18
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=132
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled'
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-09T23:40:34.314'
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-09T23:40:34.314'
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1
2026-03-08T23:40:39.282 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-08T23:40:34.314854+0000'
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90'
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=true
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean'
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled'
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-09T23:40:34.314854+0000'
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=18
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=130
2026-03-08T23:40:39.283 INFO:tasks.workunit.client.0.vm03.stdout:)
2026-03-08T23:40:39.293 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 1 ~ false / 132 / 130 / true / 2026-03-08T23:40:34.314854+0000 / scrub scheduled %%% query_last_duration_neg query_last_duration
2026-03-08T23:40:39.293 INFO:tasks.workunit.client.0.vm03.stdout:key is query_last_duration: negation:1 # expected: 0 # in actual: 1
2026-03-08T23:40:39.293 INFO:tasks.workunit.client.0.vm03.stdout:WaitingAfterScrub - 'query_last_duration' actual value (1) matches expected (0) (negation: 1)
2026-03-08T23:40:39.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:157: wait_any_cond: return 0
2026-03-08T23:40:39.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:601: TEST_dump_scrub_schedule: sched_data=()
2026-03-08T23:40:39.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:602: TEST_dump_scrub_schedule: expct_dmp_duration=(['dmp_last_duration']='0' ['dmp_last_duration_neg']='not0')
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:602: TEST_dump_scrub_schedule: declare -A expct_dmp_duration
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:603: TEST_dump_scrub_schedule: wait_any_cond 1.0 10 2026-03-08T23:40:27.108058+0000 expct_dmp_duration 'WaitingAfterScrub_dmp ' sched_data
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:106: wait_any_cond: local pgid=1.0
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:107: wait_any_cond: local retries=10
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:108: wait_any_cond: local cmp_date=2026-03-08T23:40:27.108058+0000
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:109: wait_any_cond: local -n ep=expct_dmp_duration
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:110: wait_any_cond: local -n out_array=sched_data
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:111: wait_any_cond: local -A sc_data
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:112: wait_any_cond: local extr_dbg=2
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:115: wait_any_cond: local saved_echo_flag=x
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:116: wait_any_cond: set +x
2026-03-08T23:40:39.294 INFO:tasks.workunit.client.0.vm03.stdout:waiting for any condition (WaitingAfterScrub_dmp ): pg:1.0 dt:2026-03-08T23:40:27.108058+0000 (10 retries)
2026-03-08T23:40:39.947 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: {
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean",
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false,
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1,
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "periodic scrub scheduled",
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-09T23:40:34.314854+0000",
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true,
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true,
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 18,
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 131
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:40:39.959 INFO:tasks.workunit.client.0.vm03.stdout:query output:
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": {
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "active": false,
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false,
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false,
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false,
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false,
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-09T23:40:34.314854+0000",
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-09T23:40:34.314 (2026-03-09T23:40:34.314)"
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:40:40.054 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {}
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:(
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=18
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=132
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled'
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-09T23:40:34.314'
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-09T23:40:34.314'
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-08T23:40:34.314854+0000'
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90'
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=true
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean'
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false
2026-03-08T23:40:40.210 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1
2026-03-08T23:40:40.211 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled' 2026-03-08T23:40:40.211 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-09T23:40:34.314854+0000' 2026-03-08T23:40:40.211 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true 2026-03-08T23:40:40.211 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true 2026-03-08T23:40:40.211 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=18 2026-03-08T23:40:40.211 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=131 2026-03-08T23:40:40.211 INFO:tasks.workunit.client.0.vm03.stdout:) 2026-03-08T23:40:40.221 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 1 ~ false / 132 / 131 / true / 2026-03-08T23:40:34.314854+0000 / scrub scheduled %%% dmp_last_duration dmp_last_duration_neg 2026-03-08T23:40:40.230 INFO:tasks.workunit.client.0.vm03.stdout:key is dmp_last_duration: negation:1 # expected: 0 # in actual: 1 2026-03-08T23:40:40.230 INFO:tasks.workunit.client.0.vm03.stdout:WaitingAfterScrub_dmp - 'dmp_last_duration' actual value (1) matches expected (0) (negation: 1) 2026-03-08T23:40:40.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:157: wait_any_cond: return 0 2026-03-08T23:40:40.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:610: TEST_dump_scrub_schedule: ceph osd set noscrub 2026-03-08T23:40:40.429 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:40:40.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:611: TEST_dump_scrub_schedule: sleep 2 2026-03-08T23:40:42.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:612: TEST_dump_scrub_schedule: ceph tell 'osd.*' config set osd_shallow_scrub_chunk_max 3 2026-03-08T23:40:42.516 
INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:40:42.517 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_shallow_scrub_chunk_max = '' (not observed, change may require restart) " 2026-03-08T23:40:42.517 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:42.523 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:40:42.523 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_shallow_scrub_chunk_max = '' (not observed, change may require restart) " 2026-03-08T23:40:42.523 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:42.530 INFO:tasks.workunit.client.0.vm03.stdout:osd.2: { 2026-03-08T23:40:42.530 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_shallow_scrub_chunk_max = '' (not observed, change may require restart) " 2026-03-08T23:40:42.530 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:42.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:613: TEST_dump_scrub_schedule: ceph tell 'osd.*' config set osd_scrub_sleep 2.0 2026-03-08T23:40:42.611 INFO:tasks.workunit.client.0.vm03.stdout:osd.0: { 2026-03-08T23:40:42.611 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_sleep = '' " 2026-03-08T23:40:42.611 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:42.618 INFO:tasks.workunit.client.0.vm03.stdout:osd.1: { 2026-03-08T23:40:42.619 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_sleep = '' " 2026-03-08T23:40:42.619 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:42.625 INFO:tasks.workunit.client.0.vm03.stdout:osd.2: { 2026-03-08T23:40:42.626 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_sleep = '' " 2026-03-08T23:40:42.626 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:42.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:614: TEST_dump_scrub_schedule: sleep 8 
2026-03-08T23:40:50.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:615: TEST_dump_scrub_schedule: saved_last_stamp=2026-03-08T23:40:34.314854+0000 2026-03-08T23:40:50.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:617: TEST_dump_scrub_schedule: ceph tell 1.0 schedule-scrub 2026-03-08T23:40:50.711 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:40:50.711 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:40:50.711 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:40:50.711 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:39:10.721018+0000" 2026-03-08T23:40:50.711 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:50.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:618: TEST_dump_scrub_schedule: sleep 1 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:619: TEST_dump_scrub_schedule: sched_data=() 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:621: TEST_dump_scrub_schedule: expct_scrub_peri_sched=(['query_is_future']='false') 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:621: TEST_dump_scrub_schedule: declare -A expct_scrub_peri_sched 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:622: TEST_dump_scrub_schedule: wait_any_cond 1.0 10 2026-03-08T23:40:34.314854+0000 expct_scrub_peri_sched waitingBeingScheduled sched_data 2026-03-08T23:40:51.724 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:106: wait_any_cond: local pgid=1.0 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:107: wait_any_cond: local retries=10 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:108: wait_any_cond: local cmp_date=2026-03-08T23:40:34.314854+0000 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:109: wait_any_cond: local -n ep=expct_scrub_peri_sched 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:110: wait_any_cond: local -n out_array=sched_data 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:111: wait_any_cond: local -A sc_data 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:112: wait_any_cond: local extr_dbg=2 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:115: wait_any_cond: local saved_echo_flag=x 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:116: wait_any_cond: set +x 2026-03-08T23:40:51.724 INFO:tasks.workunit.client.0.vm03.stdout:waiting for any condition (waitingBeingScheduled): pg:1.0 dt:2026-03-08T23:40:34.314854+0000 (10 retries) 2026-03-08T23:40:52.380 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:40:52.393 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: { 2026-03-08T23:40:52.393 
INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean", 2026-03-08T23:40:52.393 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false, 2026-03-08T23:40:52.393 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1, 2026-03-08T23:40:52.393 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "queued for scrub", 2026-03-08T23:40:52.393 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "0", 2026-03-08T23:40:52.393 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": false, 2026-03-08T23:40:52.394 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": false, 2026-03-08T23:40:52.394 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 19, 2026-03-08T23:40:52.394 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 146 2026-03-08T23:40:52.394 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:52.394 INFO:tasks.workunit.client.0.vm03.stdout:query output: 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "active": false, 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false, 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false, 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false, 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false, 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-08T23:40:56.389 (2026-03-02T23:39:10.721)" 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:52.473 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:( 2026-03-08T23:40:52.569 
INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=19 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=148 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled' 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-08T23:40:56.389' 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-02T23:39:10.721' 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-01T23:39:10.721018+0000' 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90' 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=false 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean' 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='queued for scrub' 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='0' 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=false 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=false 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=19 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=146 2026-03-08T23:40:52.569 INFO:tasks.workunit.client.0.vm03.stdout:) 
2026-03-08T23:40:52.579 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 1 ~ false / 148 / 146 / false / 2026-03-01T23:39:10.721018+0000 / scrub scheduled %%% query_is_future 2026-03-08T23:40:52.580 INFO:tasks.workunit.client.0.vm03.stdout:key is query_is_future: negation:0 # expected: false # in actual: false 2026-03-08T23:40:52.580 INFO:tasks.workunit.client.0.vm03.stdout:waitingBeingScheduled - 'query_is_future' actual value (false) matches expected (false) (negation: 0) 2026-03-08T23:40:52.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:157: wait_any_cond: return 0 2026-03-08T23:40:52.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:634: TEST_dump_scrub_schedule: saved_last_stamp=2026-03-01T23:39:10.721018+0000 2026-03-08T23:40:52.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:635: TEST_dump_scrub_schedule: ceph osd unset noscrub 2026-03-08T23:40:52.794 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is unset 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:637: TEST_dump_scrub_schedule: cond_active=(['query_active']='true') 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:637: TEST_dump_scrub_schedule: declare -A cond_active 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:638: TEST_dump_scrub_schedule: sched_data=() 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:639: TEST_dump_scrub_schedule: wait_any_cond 1.0 10 2026-03-01T23:39:10.721018+0000 cond_active 'WaitingActive ' sched_data 
2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:106: wait_any_cond: local pgid=1.0 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:107: wait_any_cond: local retries=10 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:108: wait_any_cond: local cmp_date=2026-03-01T23:39:10.721018+0000 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:109: wait_any_cond: local -n ep=cond_active 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:110: wait_any_cond: local -n out_array=sched_data 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:111: wait_any_cond: local -A sc_data 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:112: wait_any_cond: local extr_dbg=2 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:115: wait_any_cond: local saved_echo_flag=x 2026-03-08T23:40:52.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:116: wait_any_cond: set +x 2026-03-08T23:40:52.812 INFO:tasks.workunit.client.0.vm03.stdout:waiting for any condition (WaitingActive ): pg:1.0 dt:2026-03-01T23:39:10.721018+0000 (10 retries) 2026-03-08T23:40:53.468 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:40:53.480 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: { 2026-03-08T23:40:53.480 
INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean", 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false, 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1, 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "periodic scrub scheduled", 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true, 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true, 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 19, 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 147 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:53.481 INFO:tasks.workunit.client.0.vm03.stdout:query output: 2026-03-08T23:40:53.560 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: "active": false, 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false, 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false, 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false, 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false, 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-08T23:40:56.389 (2026-03-02T23:39:10.721)" 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:53.561 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:( 
2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=20 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=151 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled' 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-08T23:40:56.389' 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-02T23:39:10.721' 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-01T23:39:10.721018+0000' 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90' 2026-03-08T23:40:53.652 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=false 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean' 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled' 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-08T23:40:56.389275+0000' 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=19 2026-03-08T23:40:53.653 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=147 2026-03-08T23:40:53.653 
INFO:tasks.workunit.client.0.vm03.stdout:) 2026-03-08T23:40:53.664 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 1 ~ false / 151 / 147 / false / 2026-03-01T23:39:10.721018+0000 / scrub scheduled %%% query_active 2026-03-08T23:40:53.664 INFO:tasks.workunit.client.0.vm03.stdout:key is query_active: negation:0 # expected: true # in actual: false 2026-03-08T23:40:54.319 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: { 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean", 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false, 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1, 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "periodic scrub scheduled", 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true, 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true, 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 20, 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 150 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:54.332 INFO:tasks.workunit.client.0.vm03.stdout:query output: 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: "active": false, 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false, 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false, 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false, 2026-03-08T23:40:54.411 
INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false, 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-08T23:40:56.389 (2026-03-02T23:39:10.721)" 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:54.411 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-08T23:40:54.503 INFO:tasks.workunit.client.0.vm03.stdout:( 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=20 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=152 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled' 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-08T23:40:56.389' 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-02T23:39:10.721' 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-01T23:39:10.721018+0000' 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90' 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=false 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean' 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1 2026-03-08T23:40:54.504 
INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled' 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-08T23:40:56.389275+0000' 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=20 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=150 2026-03-08T23:40:54.504 INFO:tasks.workunit.client.0.vm03.stdout:) 2026-03-08T23:40:54.518 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 2 ~ false / 152 / 150 / false / 2026-03-01T23:39:10.721018+0000 / scrub scheduled %%% query_active 2026-03-08T23:40:54.518 INFO:tasks.workunit.client.0.vm03.stdout:key is query_active: negation:0 # expected: true # in actual: false 2026-03-08T23:40:55.173 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:40:55.185 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: { 2026-03-08T23:40:55.185 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean", 2026-03-08T23:40:55.185 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false, 2026-03-08T23:40:55.185 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1, 2026-03-08T23:40:55.185 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "periodic scrub scheduled", 2026-03-08T23:40:55.185 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:55.186 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true, 2026-03-08T23:40:55.186 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true, 2026-03-08T23:40:55.186 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 20, 2026-03-08T23:40:55.186 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 150 2026-03-08T23:40:55.186 
INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:55.186 INFO:tasks.workunit.client.0.vm03.stdout:query output: 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "active": false, 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false, 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false, 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false, 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false, 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-08T23:40:56.389 (2026-03-02T23:39:10.721)" 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:55.267 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:( 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=20 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=153 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled' 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-08T23:40:56.389' 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-02T23:39:10.721' 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-01T23:39:10.721018+0000' 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90' 
2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=false 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean' 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled' 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-08T23:40:56.389275+0000' 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=20 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=150 2026-03-08T23:40:55.358 INFO:tasks.workunit.client.0.vm03.stdout:) 2026-03-08T23:40:55.369 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 3 ~ false / 153 / 150 / false / 2026-03-01T23:39:10.721018+0000 / scrub scheduled %%% query_active 2026-03-08T23:40:55.369 INFO:tasks.workunit.client.0.vm03.stdout:key is query_active: negation:0 # expected: true # in actual: false 2026-03-08T23:40:56.034 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:40:56.046 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: { 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean", 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false, 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1, 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: 
"dmp_schedule": "periodic scrub scheduled", 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true, 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true, 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 20, 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 152 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:56.047 INFO:tasks.workunit.client.0.vm03.stdout:query output: 2026-03-08T23:40:56.128 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-08T23:40:56.128 INFO:tasks.workunit.client.0.vm03.stdout: "active": false, 2026-03-08T23:40:56.129 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false, 2026-03-08T23:40:56.129 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false, 2026-03-08T23:40:56.129 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false, 2026-03-08T23:40:56.129 INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false, 2026-03-08T23:40:56.129 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:56.129 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-08T23:40:56.389 (2026-03-02T23:39:10.721)" 2026-03-08T23:40:56.129 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:56.129 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:( 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=20 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=153 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false 2026-03-08T23:40:56.214 
INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled' 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-08T23:40:56.389' 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-02T23:39:10.721' 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-01T23:39:10.721018+0000' 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90' 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=false 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean' 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled' 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-08T23:40:56.389275+0000' 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=20 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=152 2026-03-08T23:40:56.214 INFO:tasks.workunit.client.0.vm03.stdout:) 2026-03-08T23:40:56.224 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 4 ~ false / 153 / 152 / false / 2026-03-01T23:39:10.721018+0000 / scrub scheduled %%% query_active 2026-03-08T23:40:56.224 
INFO:tasks.workunit.client.0.vm03.stdout:key is query_active: negation:0 # expected: true # in actual: false 2026-03-08T23:40:56.879 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: { 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean", 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false, 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1, 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "periodic scrub scheduled", 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-08T23:40:56.389275+0000", 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true, 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true, 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 20, 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 152 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:56.893 INFO:tasks.workunit.client.0.vm03.stdout:query output: 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "active": true, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_start": "15", 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "start": "1:00000000::::0", 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "end": "MIN", 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "max_end": "MIN", 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "subset_last_update": "0'0", 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:40:56.972 
INFO:tasks.workunit.client.0.vm03.stdout: "req_scrub": false, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "auto_repair": false, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "check_repair": false, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "deep_scrub_on_error": false, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 5, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "shallow_errors": 0, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "deep_errors": 0, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "fixed": 0, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "waiting_on_whom": [], 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrubbing" 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:56.972 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:( 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=20 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=157 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=true 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrubbing' 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='0' 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='0' 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-01T23:39:10.721018+0000' 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90' 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=false 2026-03-08T23:40:57.063 
INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=false 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean' 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled' 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-08T23:40:56.389275+0000' 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=20 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=152 2026-03-08T23:40:57.063 INFO:tasks.workunit.client.0.vm03.stdout:) 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 5 ~ true / 157 / 152 / false / 2026-03-01T23:39:10.721018+0000 / scrubbing %%% query_active 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stdout:key is query_active: negation:0 # expected: true # in actual: true 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stdout:WaitingActive - 'query_active' actual value (true) matches expected (true) (negation: 0) 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:157: wait_any_cond: return 0 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:643: TEST_dump_scrub_schedule: cond_active_dmp=(['dmp_state_has_scrubbing']='true' ['query_active']='false') 2026-03-08T23:40:57.073 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:643: TEST_dump_scrub_schedule: declare -A cond_active_dmp 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:644: TEST_dump_scrub_schedule: sched_data=() 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:645: TEST_dump_scrub_schedule: wait_any_cond 1.0 10 2026-03-01T23:39:10.721018+0000 cond_active_dmp 'WaitingActive ' sched_data 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:106: wait_any_cond: local pgid=1.0 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:107: wait_any_cond: local retries=10 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:108: wait_any_cond: local cmp_date=2026-03-01T23:39:10.721018+0000 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:109: wait_any_cond: local -n ep=cond_active_dmp 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:110: wait_any_cond: local -n out_array=sched_data 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:111: wait_any_cond: local -A sc_data 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:112: wait_any_cond: local extr_dbg=2 2026-03-08T23:40:57.073 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:115: wait_any_cond: local saved_echo_flag=x 2026-03-08T23:40:57.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:116: wait_any_cond: set +x 2026-03-08T23:40:57.074 INFO:tasks.workunit.client.0.vm03.stdout:waiting for any condition (WaitingActive ): pg:1.0 dt:2026-03-01T23:39:10.721018+0000 (10 retries) 2026-03-08T23:40:57.725 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: { 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean+scrubbing", 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": true, 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 1, 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "scrubbing for 1s", 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "0", 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": false, 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": false, 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 20, 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 157 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:40:57.737 INFO:tasks.workunit.client.0.vm03.stdout:query output: 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "active": true, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "epoch_start": "15", 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "start": "1:00000000::::0", 
2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "end": "MIN", 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "max_end": "MIN", 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "subset_last_update": "0'0", 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "req_scrub": false, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "auto_repair": false, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "check_repair": false, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "deep_scrub_on_error": false, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "priority": 5, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "shallow_errors": 0, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "deep_errors": 0, 2026-03-08T23:40:57.816 INFO:tasks.workunit.client.0.vm03.stdout: "fixed": 0, 2026-03-08T23:40:57.817 INFO:tasks.workunit.client.0.vm03.stdout: "waiting_on_whom": [], 2026-03-08T23:40:57.817 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrubbing" 2026-03-08T23:40:57.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:40:57.817 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-08T23:40:57.906 INFO:tasks.workunit.client.0.vm03.stdout:( 2026-03-08T23:40:57.906 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=20 2026-03-08T23:40:57.906 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=158 2026-03-08T23:40:57.906 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=true 2026-03-08T23:40:57.906 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrubbing' 2026-03-08T23:40:57.906 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='0' 2026-03-08T23:40:57.906 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='0' 2026-03-08T23:40:57.906 
INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1 2026-03-08T23:40:57.906 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-01T23:39:10.721018+0000' 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='18x90' 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=false 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=false 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=null 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean+scrubbing' 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=true 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='scrubbing for 1s' 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='0' 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=false 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=false 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=20 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=157 2026-03-08T23:40:57.907 INFO:tasks.workunit.client.0.vm03.stdout:) 2026-03-08T23:40:57.917 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 1 ~ true / 158 / 157 / false / 2026-03-01T23:39:10.721018+0000 / scrubbing %%% dmp_state_has_scrubbing query_active 2026-03-08T23:40:57.917 INFO:tasks.workunit.client.0.vm03.stdout:key is dmp_state_has_scrubbing: negation:0 # expected: true # in actual: true 2026-03-08T23:40:57.917 INFO:tasks.workunit.client.0.vm03.stdout:WaitingActive - 'dmp_state_has_scrubbing' actual value (true) matches expected (true) (negation: 0) 2026-03-08T23:40:57.917 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:157: wait_any_cond: return 0 2026-03-08T23:40:57.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:646: TEST_dump_scrub_schedule: sleep 4 2026-03-08T23:41:01.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:647: TEST_dump_scrub_schedule: perf_counters td/osd-scrub-test 3 2026-03-08T23:41:01.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:43: perf_counters: local dir=td/osd-scrub-test 2026-03-08T23:41:01.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:44: perf_counters: local OSDS=3 2026-03-08T23:41:01.919 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: expr 3 - 1 2026-03-08T23:41:01.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: seq 0 2 2026-03-08T23:41:01.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:41:01.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.0 counter dump 2026-03-08T23:41:01.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 
2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:41:01.995 
INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:41:01.995 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 
2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:41:01.996 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:41:01.996 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: 
"num_scrubs_past_reservation": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.997 
INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:41:01.997 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:41:01.998 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.007 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.007 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:  ]
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.1 counter dump
2026-03-08T23:41:02.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))'
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_ec": [
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.085 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_repl": [
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 1,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 1,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 1,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 1,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0.012000083,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0.012000083
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 7,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 1,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 2
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_ec": [
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:41:02.086 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_repl": [
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 1,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 1,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 1,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 1,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 4.004013323,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 4.004013323
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:02.087 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 2,
2026-03-08T23:41:02.088 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:02.088 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:02.088 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:02.088 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 1,
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 1,
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 2
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:  ]
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.2 counter dump
2026-03-08T23:41:02.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))'
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_ec": [
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:02.166 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_repl": [
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.167 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_ec": [
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_repl": [
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:41:02.168 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:41:02.169 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:  ]
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_dump_scrub_schedule ------------------
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_dump_scrub_schedule ------------------'
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:41:02.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:41:02.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:41:02.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:41:02.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:41:02.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:41:02.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:41:02.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:41:02.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:41:02.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:41:02.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:41:02.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:41:02.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:41:02.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:41:02.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:41:02.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test
2026-03-08T23:41:02.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:41:02.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:41:02.306 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:41:02.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827
2026-03-08T23:41:02.307 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_dump_scrub_schedule ------------------
2026-03-08T23:41:02.307 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_interval_changes -------------------
2026-03-08T23:41:02.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:41:02.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:41:02.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_dump_scrub_schedule ------------------'
2026-03-08T23:41:02.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs
2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_interval_changes -------------------' 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:41:02.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:41:02.308 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:41:02.309 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:41:02.309 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:41:02.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:41:02.310 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:41:02.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:41:02.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:41:02.311 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:41:02.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:41:02.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:41:02.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:41:02.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: 
teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:41:02.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:41:02.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:41:02.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:41:02.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:41:02.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:02.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:02.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:41:02.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:41:02.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:41:02.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test 2026-03-08T23:41:02.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:41:02.316 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:02.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:02.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_interval_changes ----------------------- 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_interval_changes -----------------------' 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_interval_changes td/osd-scrub-test 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:149: TEST_interval_changes: local poolname=test 2026-03-08T23:41:02.317 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:150: TEST_interval_changes: local OSDS=2 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:151: TEST_interval_changes: local objects=10 2026-03-08T23:41:02.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:153: TEST_interval_changes: expr 24 '*' 60 '*' 60 2026-03-08T23:41:02.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:153: TEST_interval_changes: local day=86400 2026-03-08T23:41:02.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:154: TEST_interval_changes: expr 86400 '*' 7 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:154: TEST_interval_changes: local week=604800 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:155: TEST_interval_changes: local min_interval=86400 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:156: TEST_interval_changes: local max_interval=604800 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:157: TEST_interval_changes: local WAIT_FOR_UPDATE=15 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:159: TEST_interval_changes: TESTDATA=testdata.475827 2026-03-08T23:41:02.319 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:162: TEST_interval_changes: run_mon td/osd-scrub-test a --osd_pool_default_size=2 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-test 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-test/a 2026-03-08T23:41:02.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-test/a --run-dir=td/osd-scrub-test --osd_pool_default_size=2 2026-03-08T23:41:02.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:41:02.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:41:02.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:41:02.341 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:41:02.341 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:02.341 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:02.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:41:02.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-test/a '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-test/log --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=2 2026-03-08T23:41:02.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:41:02.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:41:02.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:41:02.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local 
id=a 2026-03-08T23:41:02.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:41:02.369 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:41:02.369 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:41:02.370 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:41:02.370 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:41:02.370 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:41:02.370 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:02.370 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:02.371 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:41:02.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:41:02.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get fsid 2026-03-08T23:41:02.435 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:41:02.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:41:02.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:41:02.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:41:02.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:41:02.435 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:41:02.435 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:41:02.436 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:41:02.436 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:41:02.436 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:02.436 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:02.436 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: 
get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:41:02.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:41:02.436 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get mon_host 2026-03-08T23:41:02.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:163: TEST_interval_changes: run_mgr td/osd-scrub-test x --mgr_stats_period=1 2026-03-08T23:41:02.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-test 2026-03-08T23:41:02.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:41:02.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:41:02.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:41:02.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-test/x 2026-03-08T23:41:02.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:41:02.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:41:02.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 
2026-03-08T23:41:02.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:41:02.614 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:41:02.614 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:02.614 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:02.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:41:02.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:41:02.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-test/x '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mgr_stats_period=1 2026-03-08T23:41:02.639 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:164: TEST_interval_changes: expr 2 - 1 2026-03-08T23:41:02.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:164: TEST_interval_changes: 
seq 0 1 2026-03-08T23:41:02.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:164: TEST_interval_changes: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:41:02.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:166: TEST_interval_changes: run_osd td/osd-scrub-test 0 --osd_scrub_min_interval=86400 --osd_scrub_max_interval=604800 --osd_scrub_interval_randomize_ratio=0 2026-03-08T23:41:02.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/0 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:41:02.648 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/0' 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/0/journal' 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:02.648 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:02.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:41:02.649 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_scrub_min_interval=86400 --osd_scrub_max_interval=604800 --osd_scrub_interval_randomize_ratio=0' 2026-03-08T23:41:02.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/0 2026-03-08T23:41:02.650 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:41:02.651 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 e7b8d3bd-9bc9-4d2b-9e6e-60179d2db921 2026-03-08T23:41:02.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=e7b8d3bd-9bc9-4d2b-9e6e-60179d2db921 2026-03-08T23:41:02.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 e7b8d3bd-9bc9-4d2b-9e6e-60179d2db921' 2026-03-08T23:41:02.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:41:02.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: 
OSD_SECRET=AQAOCa5p5HACKBAA4/IzeQtCol3LjWkoSfufsA== 2026-03-08T23:41:02.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAOCa5p5HACKBAA4/IzeQtCol3LjWkoSfufsA=="}' 2026-03-08T23:41:02.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new e7b8d3bd-9bc9-4d2b-9e6e-60179d2db921 -i td/osd-scrub-test/0/new.json 2026-03-08T23:41:02.762 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:41:02.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/0/new.json 2026-03-08T23:41:02.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_min_interval=86400 --osd_scrub_max_interval=604800 --osd_scrub_interval_randomize_ratio=0 --mkfs --key AQAOCa5p5HACKBAA4/IzeQtCol3LjWkoSfufsA== --osd-uuid e7b8d3bd-9bc9-4d2b-9e6e-60179d2db921 2026-03-08T23:41:02.793 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:02.798+0000 7f28565e68c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:41:02.795 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:02.802+0000 7f28565e68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:41:02.797 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:02.802+0000 7f28565e68c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:41:02.797 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:02.802+0000 7f28565e68c0 -1 bdev(0x5568e8d2ec00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:41:02.797 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:02.802+0000 7f28565e68c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid 2026-03-08T23:41:05.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/0/keyring 2026-03-08T23:41:05.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:41:05.056 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:41:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:41:05.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:41:05.158 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:41:05.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:41:05.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 
--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_min_interval=86400 --osd_scrub_max_interval=604800 --osd_scrub_interval_randomize_ratio=0 2026-03-08T23:41:05.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:41:05.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:41:05.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:41:05.197 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:05.198+0000 7f9c7f9d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:41:05.207 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:05.214+0000 7f9c7f9d88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:41:05.222 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:05.222+0000 7f9c7f9d88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:41:05.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:41:05.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:41:05.659 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:05.666+0000 7f9c7f9d88c0 -1 Falling back to public interface 2026-03-08T23:41:06.484 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:41:06.484 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:41:06.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:41:06.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:41:06.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:41:06.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:41:06.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:41:06.857 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:06.862+0000 7f9c7f9d88c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:41:07.654 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:41:07.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:41:07.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:41:07.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:41:07.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:41:07.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:41:07.895 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:41:08.896 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:41:08.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:41:08.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:41:08.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:41:08.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:41:08.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:41:09.073 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/21041226,v1:127.0.0.1:6803/21041226] [v2:127.0.0.1:6804/21041226,v1:127.0.0.1:6805/21041226] exists,up e7b8d3bd-9bc9-4d2b-9e6e-60179d2db921 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:164: TEST_interval_changes: for osd in $(seq 0 $(expr $OSDS - 
1)) 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:166: TEST_interval_changes: run_osd td/osd-scrub-test 1 --osd_scrub_min_interval=86400 --osd_scrub_max_interval=604800 --osd_scrub_interval_randomize_ratio=0 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/1 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:41:09.074 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/1' 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/1/journal' 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:41:09.074 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:09.075 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:41:09.075 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_scrub_min_interval=86400 --osd_scrub_max_interval=604800 --osd_scrub_interval_randomize_ratio=0' 2026-03-08T23:41:09.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/1 2026-03-08T23:41:09.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:41:09.078 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 278fc43b-e7df-441c-a993-4efe0d04684a 2026-03-08T23:41:09.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=278fc43b-e7df-441c-a993-4efe0d04684a 2026-03-08T23:41:09.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 278fc43b-e7df-441c-a993-4efe0d04684a' 2026-03-08T23:41:09.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:41:09.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAVCa5pfKnpBRAA8czCIxce+N3uPXCopwvHsg== 2026-03-08T23:41:09.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": 
"AQAVCa5pfKnpBRAA8czCIxce+N3uPXCopwvHsg=="}' 2026-03-08T23:41:09.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 278fc43b-e7df-441c-a993-4efe0d04684a -i td/osd-scrub-test/1/new.json 2026-03-08T23:41:09.255 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:41:09.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/1/new.json 2026-03-08T23:41:09.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_min_interval=86400 --osd_scrub_max_interval=604800 --osd_scrub_interval_randomize_ratio=0 --mkfs --key AQAVCa5pfKnpBRAA8czCIxce+N3uPXCopwvHsg== --osd-uuid 278fc43b-e7df-441c-a993-4efe0d04684a 2026-03-08T23:41:09.287 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:09.294+0000 7f835a0008c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:41:09.289 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:09.294+0000 7f835a0008c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:41:09.290 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:09.294+0000 7f835a0008c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:41:09.290 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:09.294+0000 7f835a0008c0 -1 bdev(0x55cb17c47c00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:41:09.290 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:09.298+0000 7f835a0008c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid 2026-03-08T23:41:13.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/1/keyring 2026-03-08T23:41:13.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:41:13.784 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:41:13.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:41:13.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:41:14.006 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:41:14.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:41:14.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_scrub_min_interval=86400 --osd_scrub_max_interval=604800 --osd_scrub_interval_randomize_ratio=0 2026-03-08T23:41:14.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:41:14.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:41:14.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:41:14.024 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:14.030+0000 7fd1370a88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:41:14.025 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:14.030+0000 7fd1370a88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:41:14.027 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:14.034+0000 7fd1370a88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:41:14.190 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:41:14.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:41:14.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:41:14.731 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:14.738+0000 7fd1370a88c0 -1 Falling back to public interface 2026-03-08T23:41:15.414 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:41:15.414 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:41:15.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:41:15.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:41:15.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:41:15.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:41:15.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:41:15.709 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:41:15.714+0000 7fd1370a88c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:41:16.595 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:41:16.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:41:16.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:41:16.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:41:16.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:41:16.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:41:16.790 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:41:17.791 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:41:17.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:41:17.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:41:17.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:41:17.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:41:17.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:41:17.972 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/4162820862,v1:127.0.0.1:6811/4162820862] [v2:127.0.0.1:6812/4162820862,v1:127.0.0.1:6813/4162820862] exists,up 278fc43b-e7df-441c-a993-4efe0d04684a 2026-03-08T23:41:17.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:41:17.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:41:17.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:41:17.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:170: TEST_interval_changes: create_pool test 1 1 
2026-03-08T23:41:17.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T23:41:18.225 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:41:18.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:41:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:171: TEST_interval_changes: wait_for_clean 2026-03-08T23:41:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:41:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:41:19.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:41:19.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:41:19.242 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:41:19.242 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:41:19.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:41:19.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 
2026-03-08T23:41:19.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:41:19.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:41:19.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:41:19.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:41:19.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:41:19.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:41:19.308 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:41:19.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:41:19.481 INFO:tasks.workunit.client.0.vm03.stderr:1' 2026-03-08T23:41:19.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:41:19.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:41:19.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 
flush_pg_stats 2026-03-08T23:41:19.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836493 2026-03-08T23:41:19.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836493 2026-03-08T23:41:19.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493' 2026-03-08T23:41:19.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:41:19.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:41:19.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672963 2026-03-08T23:41:19.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672963 2026-03-08T23:41:19.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836493 1-42949672963' 2026-03-08T23:41:19.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:41:19.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836493 2026-03-08T23:41:19.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:41:19.635 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:41:19.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836493 2026-03-08T23:41:19.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:41:19.636 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836493 2026-03-08T23:41:19.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836493 2026-03-08T23:41:19.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836493' 2026-03-08T23:41:19.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:41:19.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836493 2026-03-08T23:41:19.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:41:20.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:41:20.944 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:41:21.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 
21474836493 2026-03-08T23:41:21.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:41:22.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:41:22.123 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:41:22.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836495 -lt 21474836493 2026-03-08T23:41:22.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:41:22.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672963 2026-03-08T23:41:22.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:41:22.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:41:22.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672963 2026-03-08T23:41:22.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:41:22.307 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963 2026-03-08T23:41:22.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672963 
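The `flush_pg_stats` trace above follows a tell-then-poll pattern: `ceph tell osd.N flush_pg_stats` returns a sequence number, and the helper then polls `ceph osd last-stat-seq N` once per second until the reported seq catches up (note the `test 21474836492 -lt 21474836493` / `sleep 1` iterations). A minimal generic sketch of that poll loop, with the `ceph` call abstracted into an arbitrary command so it runs without a cluster — `wait_for_stat_seq` is a hypothetical name, not the actual ceph-helpers.sh source:

```shell
# Poll "$@" (a command printing the current sequence number, e.g.
# "ceph osd last-stat-seq 0" in the trace) once per second until its
# output reaches TARGET or TRIES attempts are exhausted.
wait_for_stat_seq() {
    local target=$1 tries=${2:-500}
    shift 2
    local cur
    while [ "$tries" -gt 0 ]; do
        cur=$("$@")
        if [ "$cur" -ge "$target" ]; then
            return 0            # seq caught up, same as the trace's break
        fi
        sleep 1
        tries=$((tries - 1))
    done
    return 1                    # timed out waiting for the flush
}
```

With a stub in place of the `ceph` CLI, the loop can be exercised standalone; in the log above the same shape appears with `tries` starting at 500.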
2026-03-08T23:41:22.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672963' 2026-03-08T23:41:22.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:41:22.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672965 -lt 42949672963 2026-03-08T23:41:22.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:41:22.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:41:22.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:41:22.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:41:22.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:41:22.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:41:22.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:41:22.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 
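The `wait_for_clean` trace above is driven by a delay schedule from `get_timeout_delays 90 .1`, which expanded to `('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')`: delays double from the base step, are capped at 15 s, and the series is truncated so it sums to exactly the timeout. A sketch of that generation logic, reconstructed from the visible output rather than copied from ceph-helpers.sh (the function name and use of awk for float math are assumptions):

```shell
# Reconstruct the backoff schedule seen in the trace: delays double from
# STEP, clamp at a 15 s cap, and the last entry is shrunk so the whole
# series sums to TIMEOUT. awk handles the fractional arithmetic.
get_timeout_delays_sketch() {
    local timeout=$1
    local step; step=$(awk -v s="${2:-1}" 'BEGIN{print s + 0}')
    local cap=15 total=0 d="$step" out=""
    while awk -v t="$total" -v m="$timeout" 'BEGIN{exit !(t < m)}'; do
        local next room
        next=$(awk -v d="$d" -v c="$cap" 'BEGIN{print (d < c) ? d + 0 : c}')
        # shrink the final delay so the sum lands exactly on the timeout
        room=$(awk -v t="$total" -v m="$timeout" 'BEGIN{print m - t}')
        next=$(awk -v n="$next" -v r="$room" 'BEGIN{print (n < r) ? n + 0 : r + 0}')
        out="$out$next "
        total=$(awk -v t="$total" -v n="$next" 'BEGIN{print t + n}')
        d=$(awk -v d="$d" 'BEGIN{print d * 2}')
    done
    echo $out
}
```

Running it with the trace's arguments (`90 .1`) reproduces the same thirteen-element schedule, including the trailing 4.5 that tops the sum up to 90.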
2026-03-08T23:41:22.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:41:22.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:41:22.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:41:22.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:41:22.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:41:22.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:41:22.847 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:41:23.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:41:23.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:41:23.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:41:23.055 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:172: TEST_interval_changes: ceph osd dump 2026-03-08T23:41:23.055 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:172: TEST_interval_changes: awk '{ print $2 }' 2026-03-08T23:41:23.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:172: TEST_interval_changes: grep '^pool.*['\'']test['\'']' 2026-03-08T23:41:23.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:172: TEST_interval_changes: local poolid=1 2026-03-08T23:41:23.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:174: TEST_interval_changes: dd if=/dev/urandom of=testdata.475827 bs=1032 count=1 2026-03-08T23:41:23.230 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:41:23.230 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:41:23.230 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 8.4578e-05 s, 12.2 MB/s 2026-03-08T23:41:23.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: seq 1 10 2026-03-08T23:41:23.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 1 $objects` 2026-03-08T23:41:23.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj1 testdata.475827 2026-03-08T23:41:23.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 
1 $objects` 2026-03-08T23:41:23.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj2 testdata.475827 2026-03-08T23:41:23.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 1 $objects` 2026-03-08T23:41:23.278 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj3 testdata.475827 2026-03-08T23:41:23.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 1 $objects` 2026-03-08T23:41:23.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj4 testdata.475827 2026-03-08T23:41:23.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 1 $objects` 2026-03-08T23:41:23.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj5 testdata.475827 2026-03-08T23:41:23.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 1 $objects` 2026-03-08T23:41:23.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj6 testdata.475827 2026-03-08T23:41:23.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in 
`seq 1 $objects` 2026-03-08T23:41:23.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj7 testdata.475827 2026-03-08T23:41:23.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 1 $objects` 2026-03-08T23:41:23.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj8 testdata.475827 2026-03-08T23:41:23.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 1 $objects` 2026-03-08T23:41:23.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj9 testdata.475827 2026-03-08T23:41:23.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:175: TEST_interval_changes: for i in `seq 1 $objects` 2026-03-08T23:41:23.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:177: TEST_interval_changes: rados -p test put obj10 testdata.475827 2026-03-08T23:41:23.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:179: TEST_interval_changes: rm -f testdata.475827 2026-03-08T23:41:23.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:181: TEST_interval_changes: get_primary test obj1 2026-03-08T23:41:23.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 
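The repeated `for i in \`seq 1 $objects\`` / `rados -p test put objN testdata.<pid>` records above are one loop writing ten copies of the same random datafile under names `obj1`..`obj10`. A minimal sketch of that loop, with the `rados -p test put` invocation replaced by an arbitrary command so it runs without a cluster — `put_objects` is a hypothetical name for illustration only:

```shell
# Store OBJECTS copies of DATAFILE under names obj1..objN, invoking
# "$@" (a stand-in for "rados -p test put" in the trace) for each one.
put_objects() {
    local objects=$1 datafile=$2
    shift 2
    local i
    for i in $(seq 1 "$objects"); do
        "$@" "obj$i" "$datafile" || return 1
    done
}
```

Against a real cluster the command would be `put_objects 10 testdata rados -p test put`, matching the ten puts in the log.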
2026-03-08T23:41:23.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:41:23.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:41:23.484 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:41:23.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:181: TEST_interval_changes: local primary=1 2026-03-08T23:41:23.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:184: TEST_interval_changes: check_dump_scrubs 1 '1 day' '1 week' 2026-03-08T23:41:23.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:135: check_dump_scrubs: local primary=1 2026-03-08T23:41:23.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:136: check_dump_scrubs: local 'sched_time_check=1 day' 2026-03-08T23:41:23.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:137: check_dump_scrubs: local 'deadline_check=1 week' 2026-03-08T23:41:23.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: get_asok_path osd.1 2026-03-08T23:41:23.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:41:23.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: 
get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:41:23.655 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:41:23.655 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:23.655 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:23.655 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-osd.1.asok 2026-03-08T23:41:23.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: CEPH_ARGS= 2026-03-08T23:41:23.656 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: ceph --admin-daemon /tmp/ceph-asok.475827/ceph-osd.1.asok dump_scrubs 2026-03-08T23:41:23.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: DS='[ 2026-03-08T23:41:23.723 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-08T23:41:23.723 INFO:tasks.workunit.client.0.vm03.stderr: "pgid": "1.0", 2026-03-08T23:41:23.723 INFO:tasks.workunit.client.0.vm03.stderr: "sched_time": "2026-03-09T23:41:18.233075+0000", 2026-03-08T23:41:23.723 INFO:tasks.workunit.client.0.vm03.stderr: "orig_sched_time": "2026-03-09T23:41:18.233075+0000", 2026-03-08T23:41:23.723 INFO:tasks.workunit.client.0.vm03.stderr: "deadline": "2026-03-15T23:41:18.233075+0000", 2026-03-08T23:41:23.724 INFO:tasks.workunit.client.0.vm03.stderr: "forced": false 2026-03-08T23:41:23.724 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-08T23:41:23.724 
INFO:tasks.workunit.client.0.vm03.stderr:]' 2026-03-08T23:41:23.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-09T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-09T23:41:18.233075+0000",' '"deadline":' '"2026-03-15T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:41:23.724 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: jq '.[0].sched_time' 2026-03-08T23:41:23.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: eval 'SCHED_TIME="2026-03-09T23:41:18.233075+0000"' 2026-03-08T23:41:23.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: SCHED_TIME=2026-03-09T23:41:18.233075+0000 2026-03-08T23:41:23.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: echo 2026-03-09T23:41:18.233075+0000 2026-03-08T23:41:23.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:41:23.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: date +%Y-%m-%d -d 'now + 1 day' 2026-03-08T23:41:23.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: test 2026-03-09 = 2026-03-09 2026-03-08T23:41:23.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: 
echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-09T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-09T23:41:18.233075+0000",' '"deadline":' '"2026-03-15T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:41:23.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: jq '.[0].deadline' 2026-03-08T23:41:23.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: eval 'DEADLINE="2026-03-15T23:41:18.233075+0000"' 2026-03-08T23:41:23.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: DEADLINE=2026-03-15T23:41:18.233075+0000 2026-03-08T23:41:23.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: echo 2026-03-15T23:41:18.233075+0000 2026-03-08T23:41:23.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:41:23.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: date +%Y-%m-%d -d 'now + 1 week' 2026-03-08T23:41:23.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: test 2026-03-15 = 2026-03-15 2026-03-08T23:41:23.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:187: TEST_interval_changes: get_asok_path osd.1 2026-03-08T23:41:23.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 
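The `check_dump_scrubs` steps above validate `sched_time` and `deadline` by truncating each ISO-8601 timestamp to its date with `sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/'` and comparing against `date +%Y-%m-%d -d "now + <offset>"` (e.g. `test 2026-03-09 = 2026-03-09`). That comparison, isolated into a small sketch — `date_matches_offset` is a hypothetical helper name, not from the test script:

```shell
# Truncate an ISO-8601 timestamp to YYYY-MM-DD (same sed expression as
# the trace) and check it equals today's date shifted by OFFSET
# (e.g. "1 day", "1 week"), using GNU date's relative-date parsing.
date_matches_offset() {
    local stamp=$1 offset=$2
    local day expect
    day=$(echo "$stamp" | sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/')
    expect=$(date +%Y-%m-%d -d "now + $offset")
    test "$day" = "$expect"
}
```

Note this compares calendar dates, not instants, so a scrub scheduled any time tomorrow passes the "1 day" check — which is why the test's 15 s sleeps don't perturb it.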
2026-03-08T23:41:23.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:41:23.752 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:41:23.752 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:23.752 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:23.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-osd.1.asok 2026-03-08T23:41:23.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:187: TEST_interval_changes: expr 86400 '*' 2 2026-03-08T23:41:23.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:187: TEST_interval_changes: CEPH_ARGS= 2026-03-08T23:41:23.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:187: TEST_interval_changes: ceph --admin-daemon /tmp/ceph-asok.475827/ceph-osd.1.asok config set osd_scrub_min_interval 172800 2026-03-08T23:41:23.813 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:41:23.814 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_delete_sleep = '' osd_delete_sleep_hdd = '' osd_delete_sleep_hybrid = '' osd_delete_sleep_ssd = '' osd_max_backfills = '' osd_recovery_max_active = '' osd_recovery_max_active_hdd = '' osd_recovery_max_active_ssd = '' osd_recovery_sleep = '' osd_recovery_sleep_degraded = '' osd_recovery_sleep_degraded_hdd = '' 
osd_recovery_sleep_degraded_hybrid = '' osd_recovery_sleep_degraded_ssd = '' osd_recovery_sleep_hdd = '' osd_recovery_sleep_hybrid = '' osd_recovery_sleep_ssd = '' osd_scrub_min_interval = '' osd_scrub_sleep = '' osd_snap_trim_sleep = '' osd_snap_trim_sleep_hdd = '' osd_snap_trim_sleep_hybrid = '' osd_snap_trim_sleep_ssd = '' " 2026-03-08T23:41:23.814 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:41:23.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:188: TEST_interval_changes: sleep 15 2026-03-08T23:41:38.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:189: TEST_interval_changes: check_dump_scrubs 1 '2 days' '1 week' 2026-03-08T23:41:38.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:135: check_dump_scrubs: local primary=1 2026-03-08T23:41:38.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:136: check_dump_scrubs: local 'sched_time_check=2 days' 2026-03-08T23:41:38.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:137: check_dump_scrubs: local 'deadline_check=1 week' 2026-03-08T23:41:38.824 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: get_asok_path osd.1 2026-03-08T23:41:38.824 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:41:38.824 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:41:38.825 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:41:38.825 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:38.825 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:38.825 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-osd.1.asok 2026-03-08T23:41:38.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: CEPH_ARGS= 2026-03-08T23:41:38.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: ceph --admin-daemon /tmp/ceph-asok.475827/ceph-osd.1.asok dump_scrubs 2026-03-08T23:41:38.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: DS='[ 2026-03-08T23:41:38.890 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-08T23:41:38.890 INFO:tasks.workunit.client.0.vm03.stderr: "pgid": "1.0", 2026-03-08T23:41:38.890 INFO:tasks.workunit.client.0.vm03.stderr: "sched_time": "2026-03-10T23:41:18.233075+0000", 2026-03-08T23:41:38.891 INFO:tasks.workunit.client.0.vm03.stderr: "orig_sched_time": "2026-03-10T23:41:18.233075+0000", 2026-03-08T23:41:38.891 INFO:tasks.workunit.client.0.vm03.stderr: "deadline": "2026-03-15T23:41:18.233075+0000", 2026-03-08T23:41:38.891 INFO:tasks.workunit.client.0.vm03.stderr: "forced": false 2026-03-08T23:41:38.891 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-08T23:41:38.891 INFO:tasks.workunit.client.0.vm03.stderr:]' 
2026-03-08T23:41:38.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-10T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-10T23:41:18.233075+0000",' '"deadline":' '"2026-03-15T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:41:38.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: jq '.[0].sched_time' 2026-03-08T23:41:38.905 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: eval 'SCHED_TIME="2026-03-10T23:41:18.233075+0000"' 2026-03-08T23:41:38.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: SCHED_TIME=2026-03-10T23:41:18.233075+0000 2026-03-08T23:41:38.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:41:38.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: echo 2026-03-10T23:41:18.233075+0000 2026-03-08T23:41:38.907 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: date +%Y-%m-%d -d 'now + 2 days' 2026-03-08T23:41:38.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: test 2026-03-10 = 2026-03-10 2026-03-08T23:41:38.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' 
'"sched_time":' '"2026-03-10T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-10T23:41:18.233075+0000",' '"deadline":' '"2026-03-15T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:41:38.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: jq '.[0].deadline' 2026-03-08T23:41:38.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: eval 'DEADLINE="2026-03-15T23:41:18.233075+0000"' 2026-03-08T23:41:38.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: DEADLINE=2026-03-15T23:41:18.233075+0000 2026-03-08T23:41:38.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: echo 2026-03-15T23:41:18.233075+0000 2026-03-08T23:41:38.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:41:38.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: date +%Y-%m-%d -d 'now + 1 week' 2026-03-08T23:41:38.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: test 2026-03-15 = 2026-03-15 2026-03-08T23:41:38.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:192: TEST_interval_changes: get_asok_path osd.1 2026-03-08T23:41:38.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:41:38.921 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:41:38.922 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:41:38.922 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:38.922 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:38.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-osd.1.asok 2026-03-08T23:41:38.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:192: TEST_interval_changes: expr 604800 '*' 2 2026-03-08T23:41:38.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:192: TEST_interval_changes: CEPH_ARGS= 2026-03-08T23:41:38.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:192: TEST_interval_changes: ceph --admin-daemon /tmp/ceph-asok.475827/ceph-osd.1.asok config set osd_scrub_max_interval 1209600 2026-03-08T23:41:38.983 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:41:38.983 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_max_interval = '' " 2026-03-08T23:41:38.983 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:41:38.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:193: TEST_interval_changes: sleep 15 2026-03-08T23:41:53.995 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:194: TEST_interval_changes: check_dump_scrubs 1 '2 days' '2 week' 2026-03-08T23:41:53.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:135: check_dump_scrubs: local primary=1 2026-03-08T23:41:53.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:136: check_dump_scrubs: local 'sched_time_check=2 days' 2026-03-08T23:41:53.995 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:137: check_dump_scrubs: local 'deadline_check=2 week' 2026-03-08T23:41:53.995 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: get_asok_path osd.1 2026-03-08T23:41:53.995 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:41:53.995 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:41:53.996 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:41:53.996 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:41:53.996 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:41:53.996 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-osd.1.asok 
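The extraction step that follows each `dump_scrubs` call (osd-scrub-test.sh:141) pipes the JSON through `jq` and then `eval`s the quoted result into a shell variable. A standalone replay of that pattern, using a trimmed sample of the dump shown in the trace rather than a live admin-socket call:

```shell
#!/bin/sh
# Replay of the sched_time extraction at osd-scrub-test.sh:141.
# DS is a trimmed sample of the traced dump_scrubs output; a live run would use
# something like: DS=$(ceph --admin-daemon <asok-path> dump_scrubs)
DS='[{"pgid": "1.0", "sched_time": "2026-03-10T23:41:18.233075+0000"}]'

# jq emits the value still wrapped in double quotes; eval strips them while
# assigning, which is exactly the SCHED_TIME="..." step visible in the trace.
eval SCHED_TIME=$(echo "$DS" | jq '.[0].sched_time')
echo "$SCHED_TIME"
```

This reproduces the traced assignment `SCHED_TIME=2026-03-10T23:41:18.233075+0000`; the deadline extraction at line 144 is the same pattern with `.[0].deadline`.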
2026-03-08T23:41:53.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: CEPH_ARGS= 2026-03-08T23:41:53.997 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: ceph --admin-daemon /tmp/ceph-asok.475827/ceph-osd.1.asok dump_scrubs 2026-03-08T23:41:54.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: DS='[ 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr: "pgid": "1.0", 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr: "sched_time": "2026-03-10T23:41:18.233075+0000", 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr: "orig_sched_time": "2026-03-10T23:41:18.233075+0000", 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr: "deadline": "2026-03-22T23:41:18.233075+0000", 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr: "forced": false 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr:]' 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-10T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-10T23:41:18.233075+0000",' '"deadline":' '"2026-03-22T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:41:54.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: jq '.[0].sched_time' 2026-03-08T23:41:54.083 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: eval 'SCHED_TIME="2026-03-10T23:41:18.233075+0000"' 2026-03-08T23:41:54.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: SCHED_TIME=2026-03-10T23:41:18.233075+0000 2026-03-08T23:41:54.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: echo 2026-03-10T23:41:18.233075+0000 2026-03-08T23:41:54.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:41:54.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: date +%Y-%m-%d -d 'now + 2 days' 2026-03-08T23:41:54.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: test 2026-03-10 = 2026-03-10 2026-03-08T23:41:54.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-10T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-10T23:41:18.233075+0000",' '"deadline":' '"2026-03-22T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:41:54.087 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: jq '.[0].deadline' 2026-03-08T23:41:54.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: eval 'DEADLINE="2026-03-22T23:41:18.233075+0000"' 
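Each timestamp check in `check_dump_scrubs` works at day granularity: the same `sed` expression truncates the ISO-8601 value to `YYYY-MM-DD`, which is then compared against GNU `date` relative arithmetic (`now + 2 days`, `now + 2 week`, ...). A minimal deterministic sketch of that comparison; the base date is pinned to the job's date here so the example is reproducible, whereas the real test uses `now`:

```shell
#!/bin/sh
# Truncate an ISO-8601 timestamp to its calendar date, as osd-scrub-test.sh:142 does.
ts="2026-03-10T23:41:18.233075+0000"   # sample sched_time from the dump above
got=$(echo "$ts" | sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/')

# The real check uses `date +%Y-%m-%d -d 'now + 2 days'`; pin "now" to the
# job's date (2026-03-08) to make the sketch deterministic. GNU date required.
want=$(date +%Y-%m-%d -d "2026-03-08 + 2 days")

test "$got" = "$want" && echo "sched_time lands on the expected day"
```

Comparing only dates keeps the check robust against the sub-second scheduling jitter visible in the dump, at the cost of being ambiguous for intervals shorter than a day.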
2026-03-08T23:41:54.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: DEADLINE=2026-03-22T23:41:18.233075+0000 2026-03-08T23:41:54.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: echo 2026-03-22T23:41:18.233075+0000 2026-03-08T23:41:54.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:41:54.100 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: date +%Y-%m-%d -d 'now + 2 week' 2026-03-08T23:41:54.100 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: test 2026-03-22 = 2026-03-22 2026-03-08T23:41:54.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:197: TEST_interval_changes: expr 86400 '*' 3 2026-03-08T23:41:54.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:197: TEST_interval_changes: ceph osd pool set test scrub_min_interval 259200 2026-03-08T23:41:54.309 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 scrub_min_interval to 259200 2026-03-08T23:41:54.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:198: TEST_interval_changes: sleep 15 2026-03-08T23:42:09.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:199: TEST_interval_changes: check_dump_scrubs 1 '3 days' '2 week' 2026-03-08T23:42:09.328 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:135: check_dump_scrubs: local primary=1 2026-03-08T23:42:09.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:136: check_dump_scrubs: local 'sched_time_check=3 days' 2026-03-08T23:42:09.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:137: check_dump_scrubs: local 'deadline_check=2 week' 2026-03-08T23:42:09.328 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: get_asok_path osd.1 2026-03-08T23:42:09.328 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:42:09.328 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:42:09.329 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:42:09.329 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:42:09.329 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:42:09.329 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-osd.1.asok 2026-03-08T23:42:09.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: CEPH_ARGS= 2026-03-08T23:42:09.329 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: ceph --admin-daemon /tmp/ceph-asok.475827/ceph-osd.1.asok dump_scrubs 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: DS='[ 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr: "pgid": "1.0", 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr: "sched_time": "2026-03-11T23:41:18.233075+0000", 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr: "orig_sched_time": "2026-03-11T23:41:18.233075+0000", 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr: "deadline": "2026-03-22T23:41:18.233075+0000", 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr: "forced": false 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr:]' 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-11T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-11T23:41:18.233075+0000",' '"deadline":' '"2026-03-22T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:42:09.395 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: jq '.[0].sched_time' 2026-03-08T23:42:09.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: eval 'SCHED_TIME="2026-03-11T23:41:18.233075+0000"' 2026-03-08T23:42:09.405 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: SCHED_TIME=2026-03-11T23:41:18.233075+0000 2026-03-08T23:42:09.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: echo 2026-03-11T23:41:18.233075+0000 2026-03-08T23:42:09.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:42:09.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: date +%Y-%m-%d -d 'now + 3 days' 2026-03-08T23:42:09.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: test 2026-03-11 = 2026-03-11 2026-03-08T23:42:09.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-11T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-11T23:41:18.233075+0000",' '"deadline":' '"2026-03-22T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:42:09.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: jq '.[0].deadline' 2026-03-08T23:42:09.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: eval 'DEADLINE="2026-03-22T23:41:18.233075+0000"' 2026-03-08T23:42:09.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: DEADLINE=2026-03-22T23:41:18.233075+0000 2026-03-08T23:42:09.416 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: echo 2026-03-22T23:41:18.233075+0000 2026-03-08T23:42:09.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:42:09.418 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: date +%Y-%m-%d -d 'now + 2 week' 2026-03-08T23:42:09.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: test 2026-03-22 = 2026-03-22 2026-03-08T23:42:09.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:202: TEST_interval_changes: expr 604800 '*' 3 2026-03-08T23:42:09.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:202: TEST_interval_changes: ceph osd pool set test scrub_max_interval 1814400 2026-03-08T23:42:09.622 INFO:tasks.workunit.client.0.vm03.stderr:set pool 1 scrub_max_interval to 1814400 2026-03-08T23:42:09.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:203: TEST_interval_changes: sleep 15 2026-03-08T23:42:24.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:204: TEST_interval_changes: check_dump_scrubs 1 '3 days' '3 week' 2026-03-08T23:42:24.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:135: check_dump_scrubs: local primary=1 2026-03-08T23:42:24.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:136: 
check_dump_scrubs: local 'sched_time_check=3 days' 2026-03-08T23:42:24.638 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:137: check_dump_scrubs: local 'deadline_check=3 week' 2026-03-08T23:42:24.639 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: get_asok_path osd.1 2026-03-08T23:42:24.639 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.1 2026-03-08T23:42:24.639 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.1 ']' 2026-03-08T23:42:24.639 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:42:24.639 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:42:24.639 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:42:24.639 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-osd.1.asok 2026-03-08T23:42:24.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: CEPH_ARGS= 2026-03-08T23:42:24.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: ceph --admin-daemon /tmp/ceph-asok.475827/ceph-osd.1.asok dump_scrubs 2026-03-08T23:42:24.717 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:139: check_dump_scrubs: DS='[ 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr: { 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr: "pgid": "1.0", 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr: "sched_time": "2026-03-11T23:41:18.233075+0000", 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr: "orig_sched_time": "2026-03-11T23:41:18.233075+0000", 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr: "deadline": "2026-03-29T23:41:18.233075+0000", 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr: "forced": false 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr: } 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr:]' 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-11T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-11T23:41:18.233075+0000",' '"deadline":' '"2026-03-29T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:42:24.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: jq '.[0].sched_time' 2026-03-08T23:42:24.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: eval 'SCHED_TIME="2026-03-11T23:41:18.233075+0000"' 2026-03-08T23:42:24.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:141: check_dump_scrubs: SCHED_TIME=2026-03-11T23:41:18.233075+0000 2026-03-08T23:42:24.727 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: echo 2026-03-11T23:41:18.233075+0000 2026-03-08T23:42:24.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:42:24.728 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: date +%Y-%m-%d -d 'now + 3 days' 2026-03-08T23:42:24.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:142: check_dump_scrubs: test 2026-03-11 = 2026-03-11 2026-03-08T23:42:24.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: echo '[' '{' '"pgid":' '"1.0",' '"sched_time":' '"2026-03-11T23:41:18.233075+0000",' '"orig_sched_time":' '"2026-03-11T23:41:18.233075+0000",' '"deadline":' '"2026-03-29T23:41:18.233075+0000",' '"forced":' false '}' ']' 2026-03-08T23:42:24.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: jq '.[0].deadline' 2026-03-08T23:42:24.739 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: eval 'DEADLINE="2026-03-29T23:41:18.233075+0000"' 2026-03-08T23:42:24.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:144: check_dump_scrubs: DEADLINE=2026-03-29T23:41:18.233075+0000 2026-03-08T23:42:24.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: echo 2026-03-29T23:41:18.233075+0000 2026-03-08T23:42:24.740 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: sed 's/\([0-9]*-[0-9]*-[0-9]*\).*/\1/' 2026-03-08T23:42:24.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: date +%Y-%m-%d -d 'now + 3 week' 2026-03-08T23:42:24.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:145: check_dump_scrubs: test 2026-03-29 = 2026-03-29 2026-03-08T23:42:24.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:205: TEST_interval_changes: perf_counters td/osd-scrub-test 2 2026-03-08T23:42:24.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:43: perf_counters: local dir=td/osd-scrub-test 2026-03-08T23:42:24.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:44: perf_counters: local OSDS=2 2026-03-08T23:42:24.742 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: expr 2 - 1 2026-03-08T23:42:24.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: seq 0 1 2026-03-08T23:42:24.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:42:24.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.0 counter dump 2026-03-08T23:42:24.744 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.816 
INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.816 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: 
"osd_scrub_dp_repl": [ 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 
2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: 
"pooltype": "ec" 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.817 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:42:24.818 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:42:24.818 
INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:42:24.818 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:42:24.819 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:42:24.819 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: }, 
2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:42:24.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:42:24.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.1 counter dump 2026-03-08T23:42:24.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:42:24.897 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:42:24.898 
INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 
2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.898 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:42:24.899 
INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:42:24.899 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.900 
INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:42:24.900 
INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.900 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: 
"reservation_process_failure": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: 
"avgtime": 0 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.901 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.902 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:42:24.902 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:42:24.902 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:42:24.902 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:42:24.902 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:42:24.902 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:42:24.902 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 
2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:42:24.909 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_interval_changes ------------------ 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_interval_changes ------------------' 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 
2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:42:24.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:42:25.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:42:25.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:42:25.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:42:25.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:42:25.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:42:25.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:42:25.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:42:25.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:42:25.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:42:25.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:42:25.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:42:25.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:42:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:42:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:42:25.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:42:25.044 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:42:25.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:42:25.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_interval_changes ------------------ 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_just_deep_scrubs ------------------- 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_interval_changes ------------------' 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_just_deep_scrubs -------------------' 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test 2026-03-08T23:42:25.045 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:42:25.045 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:42:25.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:42:25.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:42:25.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:42:25.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:42:25.047 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T23:42:25.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:42:25.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:42:25.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:42:25.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:42:25.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:42:25.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:42:25.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:42:25.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:42:25.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:42:25.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:42:25.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:42:25.053 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:42:25.054 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:42:25.054 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:42:25.054 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:42:25.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:42:25.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:42:25.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:42:25.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test 2026-03-08T23:42:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:42:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:42:25.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:42:25.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827 
2026-03-08T23:42:25.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:42:25.057 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_just_deep_scrubs ----------------------- 2026-03-08T23:42:25.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:42:25.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:42:25.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT 2026-03-08T23:42:25.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_just_deep_scrubs -----------------------' 2026-03-08T23:42:25.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_just_deep_scrubs td/osd-scrub-test 2026-03-08T23:42:25.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:467: TEST_just_deep_scrubs: local dir=td/osd-scrub-test 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:468: TEST_just_deep_scrubs: cluster_conf=(['osds_num']='3' ['pgs_in_pool']='4' ['pool_name']='test') 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:468: TEST_just_deep_scrubs: local -A cluster_conf 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:474: 
TEST_just_deep_scrubs: standard_scrub_cluster td/osd-scrub-test cluster_conf 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:229: standard_scrub_cluster: local dir=td/osd-scrub-test 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:230: standard_scrub_cluster: local -n args=cluster_conf 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:232: standard_scrub_cluster: local OSDS=3 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:233: standard_scrub_cluster: local pg_num=4 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:234: standard_scrub_cluster: local poolname=test 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:235: standard_scrub_cluster: args['pool_name']=test 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:236: standard_scrub_cluster: local extra_pars= 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:237: standard_scrub_cluster: local debug_msg=dbg 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:240: standard_scrub_cluster: local saved_echo_flag=x 2026-03-08T23:42:25.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:241: standard_scrub_cluster: set +x 2026-03-08T23:42:25.414 
INFO:tasks.workunit.client.0.vm03.stdout:add osd0 2ec5fafc-c38f-45c0-8aaf-1e9d943e8b9b 2026-03-08T23:42:25.537 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:42:25.577 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:25.583+0000 7f33815408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:25.580 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:25.587+0000 7f33815408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:25.582 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:25.587+0000 7f33815408c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:25.582 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:25.587+0000 7f33815408c0 -1 bdev(0x5564c679ec00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:42:25.582 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:25.587+0000 7f33815408c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid 2026-03-08T23:42:27.845 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:42:27.960 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:42:28.014 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:28.011+0000 7f1a6357e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:28.029 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:28.035+0000 7f1a6357e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:28.038 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:28.043+0000 7f1a6357e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:42:28.144 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:42:29.315 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:42:29.487 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:29.495+0000 7f1a6357e8c0 -1 Falling back to public interface 2026-03-08T23:42:30.461 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:30.467+0000 7f1a6357e8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:42:30.506 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:42:31.711 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:42:31.882 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2520246369,v1:127.0.0.1:6803/2520246369] [v2:127.0.0.1:6804/2520246369,v1:127.0.0.1:6805/2520246369] exists,up 2ec5fafc-c38f-45c0-8aaf-1e9d943e8b9b 2026-03-08T23:42:31.886 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 50cac386-4524-48c1-be7b-1564b1b98398 2026-03-08T23:42:32.066 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:42:32.101 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:32.107+0000 7f7af6dad8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:32.103 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:32.111+0000 7f7af6dad8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:32.104 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:32.111+0000 7f7af6dad8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:42:32.104 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:32.111+0000 7f7af6dad8c0 -1 bdev(0x55dcb9299c00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:42:32.104 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:32.111+0000 7f7af6dad8c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid 2026-03-08T23:42:34.365 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:42:34.583 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:42:34.601 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:34.607+0000 7f7c71bdc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:34.607 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:34.615+0000 7f7c71bdc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:34.609 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:34.615+0000 7f7c71bdc8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:42:34.763 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:42:35.071 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:35.079+0000 7f7c71bdc8c0 -1 Falling back to public interface 2026-03-08T23:42:35.938 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:42:36.056 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:36.063+0000 7f7c71bdc8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:42:37.125 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:42:37.337 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/127951744,v1:127.0.0.1:6811/127951744] [v2:127.0.0.1:6812/127951744,v1:127.0.0.1:6813/127951744] exists,up 50cac386-4524-48c1-be7b-1564b1b98398 2026-03-08T23:42:37.341 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 d40a2e7e-cda9-428d-9662-f748a28c6d7d 2026-03-08T23:42:37.563 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:42:37.606 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:37.615+0000 7f07ef3818c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:37.608 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:37.615+0000 7f07ef3818c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:37.610 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:37.615+0000 7f07ef3818c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:42:37.610 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:37.619+0000 7f07ef3818c0 -1 bdev(0x564ddfacfc00 td/osd-scrub-test/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:42:37.610 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:37.619+0000 7f07ef3818c0 -1 bluestore(td/osd-scrub-test/2) _read_fsid unparsable uuid 2026-03-08T23:42:39.872 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:42:40.089 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:42:40.109 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:40.115+0000 7f53de48f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:40.109 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:40.115+0000 7f53de48f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:42:40.111 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:40.119+0000 7f53de48f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:42:40.279 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:42:40.823 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:40.831+0000 7f53de48f8c0 -1 Falling back to public interface 2026-03-08T23:42:41.460 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:42:41.817 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:41.823+0000 7f53de48f8c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:42:42.646 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:42:43.032 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:42:43.039+0000 7f53d9c48640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T23:42:43.851 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:42:44.034 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/4042942147,v1:127.0.0.1:6819/4042942147] [v2:127.0.0.1:6820/4042942147,v1:127.0.0.1:6821/4042942147] exists,up d40a2e7e-cda9-428d-9662-f748a28c6d7d 2026-03-08T23:42:44.268 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:42:45.822 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836484 2026-03-08T23:42:47.206 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963 2026-03-08T23:42:47.413 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442 2026-03-08T23:42:49.392 INFO:tasks.workunit.client.0.vm03.stdout:standard_scrub_cluster: dbg: test pool is test 1 2026-03-08T23:42:49.392 INFO:tasks.workunit.client.0.vm03.stdout:Pool: test : 1 2026-03-08T23:42:49.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:475: TEST_just_deep_scrubs: local poolid=1 2026-03-08T23:42:49.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:476: TEST_just_deep_scrubs: local poolname=test 2026-03-08T23:42:49.392 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:477: TEST_just_deep_scrubs: echo 'Pool: test : 1' 2026-03-08T23:42:49.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:479: TEST_just_deep_scrubs: TESTDATA=testdata.475827 2026-03-08T23:42:49.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:480: TEST_just_deep_scrubs: local objects=90 2026-03-08T23:42:49.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:481: TEST_just_deep_scrubs: dd if=/dev/urandom of=testdata.475827 bs=1032 count=1 2026-03-08T23:42:49.393 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:42:49.393 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:42:49.393 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 5.9551e-05 s, 17.3 MB/s 2026-03-08T23:42:49.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: seq 1 90 2026-03-08T23:42:49.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj1 testdata.475827 2026-03-08T23:42:49.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados 
-p test put obj2 testdata.475827 2026-03-08T23:42:49.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj3 testdata.475827 2026-03-08T23:42:49.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj4 testdata.475827 2026-03-08T23:42:49.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj5 testdata.475827 2026-03-08T23:42:49.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj6 testdata.475827 2026-03-08T23:42:49.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: 
rados -p test put obj7 testdata.475827 2026-03-08T23:42:49.549 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj8 testdata.475827 2026-03-08T23:42:49.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj9 testdata.475827 2026-03-08T23:42:49.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj10 testdata.475827 2026-03-08T23:42:49.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj11 testdata.475827 2026-03-08T23:42:49.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: 
TEST_just_deep_scrubs: rados -p test put obj12 testdata.475827 2026-03-08T23:42:49.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj13 testdata.475827 2026-03-08T23:42:49.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj14 testdata.475827 2026-03-08T23:42:49.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.694 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj15 testdata.475827 2026-03-08T23:42:49.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj16 testdata.475827 2026-03-08T23:42:49.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.733 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj17 testdata.475827 2026-03-08T23:42:49.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj18 testdata.475827 2026-03-08T23:42:49.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj19 testdata.475827 2026-03-08T23:42:49.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj20 testdata.475827 2026-03-08T23:42:49.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj21 testdata.475827 2026-03-08T23:42:49.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 
2026-03-08T23:42:49.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj22 testdata.475827 2026-03-08T23:42:49.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj23 testdata.475827 2026-03-08T23:42:49.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj24 testdata.475827 2026-03-08T23:42:49.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj25 testdata.475827 2026-03-08T23:42:49.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:49.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj26 testdata.475827 2026-03-08T23:42:50.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 
$objects` 2026-03-08T23:42:50.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj27 testdata.475827 2026-03-08T23:42:50.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj28 testdata.475827 2026-03-08T23:42:50.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj29 testdata.475827 2026-03-08T23:42:50.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj30 testdata.475827 2026-03-08T23:42:50.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj31 testdata.475827 2026-03-08T23:42:50.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i 
in `seq 1 $objects` 2026-03-08T23:42:50.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj32 testdata.475827 2026-03-08T23:42:50.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj33 testdata.475827 2026-03-08T23:42:50.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj34 testdata.475827 2026-03-08T23:42:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj35 testdata.475827 2026-03-08T23:42:50.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj36 testdata.475827 2026-03-08T23:42:50.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: 
TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj37 testdata.475827 2026-03-08T23:42:50.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj38 testdata.475827 2026-03-08T23:42:50.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj39 testdata.475827 2026-03-08T23:42:50.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj40 testdata.475827 2026-03-08T23:42:50.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj41 testdata.475827 2026-03-08T23:42:50.514 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj42 testdata.475827 2026-03-08T23:42:50.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj43 testdata.475827 2026-03-08T23:42:50.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj44 testdata.475827 2026-03-08T23:42:50.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj45 testdata.475827 2026-03-08T23:42:50.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj46 testdata.475827 
2026-03-08T23:42:50.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj47 testdata.475827 2026-03-08T23:42:50.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj48 testdata.475827 2026-03-08T23:42:50.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj49 testdata.475827 2026-03-08T23:42:50.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj50 testdata.475827 2026-03-08T23:42:50.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj51 
testdata.475827 2026-03-08T23:42:50.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj52 testdata.475827 2026-03-08T23:42:50.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj53 testdata.475827 2026-03-08T23:42:50.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj54 testdata.475827 2026-03-08T23:42:50.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj55 testdata.475827 2026-03-08T23:42:50.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test 
put obj56 testdata.475827 2026-03-08T23:42:50.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj57 testdata.475827 2026-03-08T23:42:50.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj58 testdata.475827 2026-03-08T23:42:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj59 testdata.475827 2026-03-08T23:42:50.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj60 testdata.475827 2026-03-08T23:42:50.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: 
rados -p test put obj61 testdata.475827 2026-03-08T23:42:50.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:50.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj62 testdata.475827 2026-03-08T23:42:51.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj63 testdata.475827 2026-03-08T23:42:51.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj64 testdata.475827 2026-03-08T23:42:51.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj65 testdata.475827 2026-03-08T23:42:51.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: 
TEST_just_deep_scrubs: rados -p test put obj66 testdata.475827 2026-03-08T23:42:51.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj67 testdata.475827 2026-03-08T23:42:51.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj68 testdata.475827 2026-03-08T23:42:51.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj69 testdata.475827 2026-03-08T23:42:51.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj70 testdata.475827 2026-03-08T23:42:51.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.214 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj71 testdata.475827 2026-03-08T23:42:51.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj72 testdata.475827 2026-03-08T23:42:51.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj73 testdata.475827 2026-03-08T23:42:51.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj74 testdata.475827 2026-03-08T23:42:51.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj75 testdata.475827 2026-03-08T23:42:51.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 
2026-03-08T23:42:51.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj76 testdata.475827 2026-03-08T23:42:51.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj77 testdata.475827 2026-03-08T23:42:51.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj78 testdata.475827 2026-03-08T23:42:51.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.412 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj79 testdata.475827 2026-03-08T23:42:51.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj80 testdata.475827 2026-03-08T23:42:51.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 
$objects` 2026-03-08T23:42:51.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj81 testdata.475827 2026-03-08T23:42:51.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj82 testdata.475827 2026-03-08T23:42:51.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj83 testdata.475827 2026-03-08T23:42:51.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj84 testdata.475827 2026-03-08T23:42:51.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj85 testdata.475827 2026-03-08T23:42:51.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i 
in `seq 1 $objects` 2026-03-08T23:42:51.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj86 testdata.475827 2026-03-08T23:42:51.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj87 testdata.475827 2026-03-08T23:42:51.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj88 testdata.475827 2026-03-08T23:42:51.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj89 testdata.475827 2026-03-08T23:42:51.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:482: TEST_just_deep_scrubs: for i in `seq 1 $objects` 2026-03-08T23:42:51.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:484: TEST_just_deep_scrubs: rados -p test put obj90 testdata.475827 2026-03-08T23:42:51.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:486: 
TEST_just_deep_scrubs: rm -f testdata.475827 2026-03-08T23:42:51.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:491: TEST_just_deep_scrubs: ceph osd set noscrub 2026-03-08T23:42:51.915 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:42:51.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:492: TEST_just_deep_scrubs: ceph osd set nodeep-scrub 2026-03-08T23:42:52.124 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is set 2026-03-08T23:42:52.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:493: TEST_just_deep_scrubs: sleep 6 2026-03-08T23:42:58.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:494: TEST_just_deep_scrubs: date -Ins 2026-03-08T23:42:58.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:494: TEST_just_deep_scrubs: local now_is=2026-03-08T23:42:58,152134631+00:00 2026-03-08T23:42:58.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:495: TEST_just_deep_scrubs: declare -A sched_data 2026-03-08T23:42:58.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:496: TEST_just_deep_scrubs: local pgid=1.2 2026-03-08T23:42:58.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:499: TEST_just_deep_scrubs: set_query_debug 1.2 2026-03-08T23:42:58.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:303: set_query_debug: local pgid=1.2 2026-03-08T23:42:58.142 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:305: set_query_debug: ceph pg dump pgs_brief 2026-03-08T23:42:58.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:305: set_query_debug: awk -v 'pg=^1.2' -n -e '$0 ~ pg { print(gensub(/[^0-9]*([0-9]+).*/,"\\1","g",$5)); }' 2026-03-08T23:42:58.297 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs_brief 2026-03-08T23:42:58.311 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:305: set_query_debug: local prim_osd=0 2026-03-08T23:42:58.311 INFO:tasks.workunit.client.0.vm03.stdout:Setting scrub debug data. Primary for 1.2 is 0 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:307: set_query_debug: echo 'Setting scrub debug data. Primary for 1.2 is 0' 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:308: set_query_debug: get_asok_path osd.0 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=osd.0 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n osd.0 ']' 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: 
get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-osd.0.asok 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:308: set_query_debug: CEPH_ARGS= 2026-03-08T23:42:58.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:308: set_query_debug: ceph --format=json daemon /tmp/ceph-asok.475827/ceph-osd.0.asok scrubdebug 1.2 set sessions 2026-03-08T23:42:58.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:501: TEST_just_deep_scrubs: extract_published_sch 1.2 2026-03-08T23:42:58,152134631+00:00 2026-03-08T23:42:58,152134631+00:00 sched_data 2026-03-08T23:42:58.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:17: extract_published_sch: local pgn=1.2 2026-03-08T23:42:58.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:18: extract_published_sch: local -n dict=sched_data 2026-03-08T23:42:58.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:19: extract_published_sch: local current_time=2026-03-08T23:42:58,152134631+00:00 2026-03-08T23:42:58.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:20: extract_published_sch: local extra_time=2026-03-08T23:42:58,152134631+00:00 2026-03-08T23:42:58.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:21: extract_published_sch: local extr_dbg=2 2026-03-08T23:42:58.381 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:24: extract_published_sch: local saved_echo_flag=x 2026-03-08T23:42:58.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:25: extract_published_sch: set +x 2026-03-08T23:42:58.564 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout:{"success":true}from pg dump pg: { 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_pg_state": "active+clean", 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_state_has_scrubbing": false, 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_last_duration": 0, 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule": "periodic scrub scheduled", 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_schedule_at": "2026-03-09T23:42:44.273087+0000", 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_is_future": true, 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_vs_date": true, 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_reported_epoch": 21, 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout: "dmp_seq": 36 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:42:58.579 INFO:tasks.workunit.client.0.vm03.stdout:query output: 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "scrubber": { 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "active": false, 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "must_scrub": false, 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "must_deep_scrub": false, 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "must_repair": false, 2026-03-08T23:42:58.668 
INFO:tasks.workunit.client.0.vm03.stdout: "need_auto": false, 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reg_stamp": "2026-03-09T23:42:44.273087+0000", 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "schedule": "scrub scheduled @ 2026-03-09T23:42:44.273 (2026-03-09T23:42:44.273)", 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "test_sequence": 8135680 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:42:58.668 INFO:tasks.workunit.client.0.vm03.stdout: "agent_state": {} 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:( 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=21 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=37 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled' 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-09T23:42:44.273' 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-09T23:42:44.273' 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=0 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-08T23:42:44.273087+0000' 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='0x0' 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=true 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true 2026-03-08T23:42:58.778 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=8135680 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean' 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false 2026-03-08T23:42:58.779 
INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=0 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled' 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-09T23:42:44.273087+0000' 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=21 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=36 2026-03-08T23:42:58.779 INFO:tasks.workunit.client.0.vm03.stdout:) 2026-03-08T23:42:58.790 INFO:tasks.workunit.client.0.vm03.stdout:test counter @ start: 8135680 2026-03-08T23:42:58.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:502: TEST_just_deep_scrubs: local saved_last_stamp=2026-03-08T23:42:44.273087+0000 2026-03-08T23:42:58.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:503: TEST_just_deep_scrubs: local dbg_counter_at_start=8135680 2026-03-08T23:42:58.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:504: TEST_just_deep_scrubs: echo 'test counter @ start: 8135680' 2026-03-08T23:42:58.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:506: TEST_just_deep_scrubs: ceph tell 1.2 schedule-deep-scrub 2026-03-08T23:42:58.866 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:42:58.866 INFO:tasks.workunit.client.0.vm03.stdout: "deep": true, 2026-03-08T23:42:58.866 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:42:58.866 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-02-22T23:41:18.876621+0000" 2026-03-08T23:42:58.866 
INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:42:58.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:508: TEST_just_deep_scrubs: sleep 5
2026-03-08T23:43:03.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:510: TEST_just_deep_scrubs: ceph pg dump pgs --format=json-pretty
2026-03-08T23:43:03.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:510: TEST_just_deep_scrubs: jq -r '.pg_stats[] | "\(.pgid) \(.stat_sum.num_objects)"'
2026-03-08T23:43:04.049 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:43:04.065 INFO:tasks.workunit.client.0.vm03.stdout:1.3 20
2026-03-08T23:43:04.065 INFO:tasks.workunit.client.0.vm03.stdout:1.2 20
2026-03-08T23:43:04.065 INFO:tasks.workunit.client.0.vm03.stdout:1.1 25
2026-03-08T23:43:04.065 INFO:tasks.workunit.client.0.vm03.stdout:1.0 25
2026-03-08T23:43:04.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:511: TEST_just_deep_scrubs: ceph pg 1.2 query --format=json-pretty
2026-03-08T23:43:04.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:511: TEST_just_deep_scrubs: jq -r .info.stats.stat_sum.num_objects
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stdout:Objects # in pg 1.2: 20
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:511: TEST_just_deep_scrubs: echo 'Objects # in pg 1.2: ' 20
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:513: TEST_just_deep_scrubs: declare -A sc_data_2
2026-03-08T23:43:04.153
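The per-PG object counts above (`1.3 20`, `1.2 20`, …) come from piping `ceph pg dump pgs --format=json-pretty` through the jq filter traced at osd-scrub-test.sh:510. A minimal sketch of that same filter, run on an invented miniature pg-dump document (the sample JSON below is hypothetical; the field names `pg_stats`, `pgid`, and `stat_sum.num_objects` match the command in the log, no live cluster needed):

```shell
# Hypothetical miniature of "ceph pg dump pgs --format=json-pretty" output.
pg_dump='{"pg_stats":[{"pgid":"1.2","stat_sum":{"num_objects":20}},
                      {"pgid":"1.1","stat_sum":{"num_objects":25}}]}'

# Same jq expression as the test: one "<pgid> <num_objects>" line per PG.
echo "$pg_dump" | jq -r '.pg_stats[] | "\(.pgid) \(.stat_sum.num_objects)"'
# prints:
# 1.2 20
# 1.1 25
```

`-r` makes jq emit raw strings instead of JSON-quoted ones, which is what lets the test compare the lines with plain shell tools.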
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:514: TEST_just_deep_scrubs: extract_published_sch 1.2 2026-03-08T23:42:58,152134631+00:00 2026-03-08T23:42:58,152134631+00:00 sc_data_2
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:17: extract_published_sch: local pgn=1.2
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:18: extract_published_sch: local -n dict=sc_data_2
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:19: extract_published_sch: local current_time=2026-03-08T23:42:58,152134631+00:00
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:20: extract_published_sch: local extra_time=2026-03-08T23:42:58,152134631+00:00
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:21: extract_published_sch: local extr_dbg=2
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:24: extract_published_sch: local saved_echo_flag=x
2026-03-08T23:43:04.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:25: extract_published_sch: set +x
2026-03-08T23:43:04.309 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: {
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_pg_state": "active+clean",
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_state_has_scrubbing": false,
2026-03-08T23:43:04.323
INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_last_duration": 0,
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_schedule": "queued for deep scrub",
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_schedule_at": "0",
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_is_future": false,
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_vs_date": false,
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_reported_epoch": 21,
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_seq": 38
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:43:04.323 INFO:tasks.workunit.client.0.vm03.stdout:query output:
2026-03-08T23:43:04.409 INFO:tasks.workunit.client.0.vm03.stdout:    "scrubber": {
2026-03-08T23:43:04.409 INFO:tasks.workunit.client.0.vm03.stdout:        "active": false,
2026-03-08T23:43:04.409 INFO:tasks.workunit.client.0.vm03.stdout:        "must_scrub": false,
2026-03-08T23:43:04.409 INFO:tasks.workunit.client.0.vm03.stdout:        "must_deep_scrub": false,
2026-03-08T23:43:04.409 INFO:tasks.workunit.client.0.vm03.stdout:        "must_repair": false,
2026-03-08T23:43:04.409 INFO:tasks.workunit.client.0.vm03.stdout:        "need_auto": false,
2026-03-08T23:43:04.409 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reg_stamp": "2026-03-08T23:43:04.641422+0000",
2026-03-08T23:43:04.409 INFO:tasks.workunit.client.0.vm03.stdout:        "schedule": "deep scrub scheduled @ 2026-03-08T23:43:04.641 (2026-02-23T23:41:18.876)",
2026-03-08T23:43:04.410 INFO:tasks.workunit.client.0.vm03.stdout:        "test_sequence": 8135680
2026-03-08T23:43:04.410 INFO:tasks.workunit.client.0.vm03.stdout:    },
2026-03-08T23:43:04.410 INFO:tasks.workunit.client.0.vm03.stdout:    "agent_state": {}
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:(
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=21
2026-03-08T23:43:04.514
INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=39
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='deep scrub scheduled'
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-08T23:43:04.641'
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-02-23T23:41:18.876'
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=0
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-02-22T23:41:18.876621+0000'
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='0x0'
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=false
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=8135680
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean'
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=0
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='queued for deep scrub'
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='0'
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=false
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=false
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=21
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=38
2026-03-08T23:43:04.514 INFO:tasks.workunit.client.0.vm03.stdout:)
2026-03-08T23:43:04.526 INFO:tasks.workunit.client.0.vm03.stdout:test counter @ should show no change: 8135680
2026-03-08T23:43:04.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:515: TEST_just_deep_scrubs: echo 'test counter @ should show no change: ' 8135680
2026-03-08T23:43:04.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:516: TEST_just_deep_scrubs: (( 0 == 0 ))
2026-03-08T23:43:04.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:517: TEST_just_deep_scrubs: (( 8135680 == 8135680 ))
2026-03-08T23:43:04.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:520: TEST_just_deep_scrubs: ceph osd unset nodeep-scrub
2026-03-08T23:43:04.742 INFO:tasks.workunit.client.0.vm03.stderr:nodeep-scrub is unset
2026-03-08T23:43:04.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:521: TEST_just_deep_scrubs: sleep 5
2026-03-08T23:43:09.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:522: TEST_just_deep_scrubs: expct_qry_duration=(['query_last_duration']='0' ['query_last_duration_neg']='not0')
2026-03-08T23:43:09.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:522: TEST_just_deep_scrubs: declare -A expct_qry_duration
2026-03-08T23:43:09.759 INFO:tasks.workunit.client.0.vm03.stdout:test counter @ should be higher than before the unset:
2026-03-08T23:43:09.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:523: TEST_just_deep_scrubs: sc_data_2=()
2026-03-08T23:43:09.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:524: TEST_just_deep_scrubs: echo 'test counter @ should be higher than before the unset: '
2026-03-08T23:43:09.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:525: TEST_just_deep_scrubs: wait_any_cond 1.2 10 2026-03-08T23:42:44.273087+0000 expct_qry_duration 'WaitingAfterScrub ' sc_data_2
2026-03-08T23:43:09.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:106: wait_any_cond: local pgid=1.2
2026-03-08T23:43:09.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:107: wait_any_cond: local retries=10
2026-03-08T23:43:09.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:108: wait_any_cond: local cmp_date=2026-03-08T23:42:44.273087+0000
2026-03-08T23:43:09.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:109: wait_any_cond: local -n ep=expct_qry_duration
2026-03-08T23:43:09.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:110: wait_any_cond: local -n out_array=sc_data_2
2026-03-08T23:43:09.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:111: wait_any_cond: local -A sc_data
2026-03-08T23:43:09.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:112: wait_any_cond: local extr_dbg=2
2026-03-08T23:43:09.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:115: wait_any_cond: local saved_echo_flag=x
2026-03-08T23:43:09.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:116: wait_any_cond: set +x
2026-03-08T23:43:09.760
INFO:tasks.workunit.client.0.vm03.stdout:waiting for any condition (WaitingAfterScrub ): pg:1.2 dt:2026-03-08T23:42:44.273087+0000 (10 retries)
2026-03-08T23:43:10.425 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:from pg dump pg: {
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_pg_state": "active+clean",
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_state_has_scrubbing": false,
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_last_duration": 1,
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_schedule": "periodic scrub scheduled",
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_schedule_at": "2026-03-09T23:43:05.584881+0000",
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_is_future": true,
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_vs_date": true,
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_reported_epoch": 22,
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:  "dmp_seq": 49
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:43:10.438 INFO:tasks.workunit.client.0.vm03.stdout:query output:
2026-03-08T23:43:10.522 INFO:tasks.workunit.client.0.vm03.stdout:    "scrubber": {
2026-03-08T23:43:10.522 INFO:tasks.workunit.client.0.vm03.stdout:        "active": false,
2026-03-08T23:43:10.522 INFO:tasks.workunit.client.0.vm03.stdout:        "must_scrub": false,
2026-03-08T23:43:10.522 INFO:tasks.workunit.client.0.vm03.stdout:        "must_deep_scrub": false,
2026-03-08T23:43:10.522 INFO:tasks.workunit.client.0.vm03.stdout:        "must_repair": false,
2026-03-08T23:43:10.522 INFO:tasks.workunit.client.0.vm03.stdout:        "need_auto": false,
2026-03-08T23:43:10.522 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reg_stamp": "2026-03-09T23:43:05.584881+0000",
2026-03-08T23:43:10.522
INFO:tasks.workunit.client.0.vm03.stdout:        "schedule": "scrub scheduled @ 2026-03-09T23:43:05.584 (2026-03-09T23:43:05.584)",
2026-03-08T23:43:10.523 INFO:tasks.workunit.client.0.vm03.stdout:        "test_sequence": 8135683
2026-03-08T23:43:10.523 INFO:tasks.workunit.client.0.vm03.stdout:    },
2026-03-08T23:43:10.523 INFO:tasks.workunit.client.0.vm03.stdout:    "agent_state": {}
2026-03-08T23:43:10.609 INFO:tasks.workunit.client.0.vm03.stdout:(
2026-03-08T23:43:10.609 INFO:tasks.workunit.client.0.vm03.stdout:[query_epoch]=22
2026-03-08T23:43:10.609 INFO:tasks.workunit.client.0.vm03.stdout:[query_seq]=50
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_active]=false
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule]='scrub scheduled'
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_schedule_at]='2026-03-09T23:43:05.584'
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_target_at]='2026-03-09T23:43:05.584'
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_duration]=1
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_stamp]='2026-03-08T23:43:05.584881+0000'
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_last_scrub]='19x20'
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_is_future]=true
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_vs_date]=true
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[query_scrub_seq]=8135683
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_pg_state]='active+clean'
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_state_has_scrubbing]=false
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_last_duration]=1
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule]='periodic scrub scheduled'
2026-03-08T23:43:10.610
INFO:tasks.workunit.client.0.vm03.stdout:[dmp_schedule_at]='2026-03-09T23:43:05.584881+0000'
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_is_future]=true
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_vs_date]=true
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_reported_epoch]=22
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:[dmp_seq]=49
2026-03-08T23:43:10.610 INFO:tasks.workunit.client.0.vm03.stdout:)
2026-03-08T23:43:10.620 INFO:tasks.workunit.client.0.vm03.stdout:--> loop: 1 ~ false / 50 / 49 / true / 2026-03-08T23:43:05.584881+0000 / scrub scheduled %%% query_last_duration_neg query_last_duration
2026-03-08T23:43:10.620 INFO:tasks.workunit.client.0.vm03.stdout:key is query_last_duration: negation:1 # expected: 0 # in actual: 1
2026-03-08T23:43:10.620 INFO:tasks.workunit.client.0.vm03.stdout:WaitingAfterScrub - 'query_last_duration' actual value (1) matches expected (0) (negation: 1)
2026-03-08T23:43:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/scrub-helpers.sh:157: wait_any_cond: return 0
2026-03-08T23:43:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:526: TEST_just_deep_scrubs: perf_counters td/osd-scrub-test 3
2026-03-08T23:43:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:43: perf_counters: local dir=td/osd-scrub-test
2026-03-08T23:43:10.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:44: perf_counters: local OSDS=3
2026-03-08T23:43:10.620 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: expr 3 - 1
2026-03-08T23:43:10.621
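The `wait_any_cond` trace above shows the test's bash-nameref idiom: the caller passes the *names* of associative arrays (`expct_qry_duration`, `sc_data_2`), and the helper binds them with `local -n` so it can read the expected values and write results back to the caller's array. A minimal runnable sketch of that pattern, using an invented helper `check_cond` (not the real `wait_any_cond`, which also polls the cluster):

```shell
# Hypothetical reduction of the nameref pattern used by wait_any_cond
# (scrub-helpers.sh:109-110 in the trace above). Requires bash 4.3+ for 'local -n'.
check_cond() {
  local -n expected=$1   # nameref to the caller's expectations array
  local -n results=$2    # nameref to the caller's output array
  local actual=1         # stands in for the value parsed from 'ceph pg query'
  results['query_last_duration']=$actual
  # success when the actual value differs from the expected "before" value
  [[ "$actual" != "${expected['query_last_duration']}" ]]
}

declare -A expct=( ['query_last_duration']='0' )
declare -A out=()
check_cond expct out && echo "condition met: duration=${out['query_last_duration']}"
# prints: condition met: duration=1
```

The nameref avoids copying arrays through the shell (bash cannot pass associative arrays by value), which is why the log shows the array *names* on the `wait_any_cond` command line.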
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: seq 0 2
2026-03-08T23:43:10.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:43:10.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.0 counter dump
2026-03-08T23:43:10.621 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))'
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_ec": [
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.691
INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.691 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.692
INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_repl": [
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 1,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 1,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 1,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 1,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0.012000052,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0.012000052
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:43:10.692
INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 2,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 1,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 1,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0.004000017,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0.004000017
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:43:10.692 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.693
INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 2
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_ec": [
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:43:10.693 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:43:10.694
INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_repl": [
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:43:10.694
INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:43:10.694 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:43:10.702 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stdout:  ]
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:43:10.703
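The `perf_counters` helper (osd-scrub-test.sh:45-47 in the trace) loops over each OSD and trims its `counter dump` down to the scrub sections with a jq key filter. A minimal sketch of that filter on an invented two-key counter-dump fragment (no `ceph tell` needed; the sample JSON is hypothetical, the jq expression is the one in the log):

```shell
# Hypothetical tiny stand-in for "ceph tell osd.N counter dump" output:
# one osd_scrub* section and one unrelated section.
dump='{"osd_scrub_sh_repl":[{"labels":{"level":"shallow"}}],"bluestore":{"onodes":1}}'

# Same filter as perf_counters: keep only top-level keys starting with "osd_scrub".
echo "$dump" | jq -c 'with_entries(select(.key | startswith("osd_scrub")))'
# prints: {"osd_scrub_sh_repl":[{"labels":{"level":"shallow"}}]}
```

`with_entries` rewrites the object as `{key, value}` pairs, applies `select` to each pair, and reassembles the survivors, which is why unrelated counter families such as `bluestore` drop out of the dumps shown above.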
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.1 counter dump 2026-03-08T23:43:10.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.778 
INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.778 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:10.779 
INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 
2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:10.779 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:43:10.780 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:10.780 
INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 
2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:10.780 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:10.781 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:10.789 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:10.789 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.790 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.790 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:43:10.790 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:43:10.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:43:10.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.2 counter dump 2026-03-08T23:43:10.790 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:43:10.859 
INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: 
"chunk_busy": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:10.860 
INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:10.860 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 
0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:10.861 
INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.861 
INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:10.861 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:10.862 
INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:10.862 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: 
"reservation_process_failure": 0, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_just_deep_scrubs ------------------ 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_just_deep_scrubs ------------------' 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:43:10.871 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:43:10.871 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:43:10.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:43:10.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:43:10.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:43:10.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:43:10.988 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:43:10.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:43:10.989 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:43:10.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:43:10.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:43:10.990 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:43:10.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:10.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:43:10.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:43:10.991 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:10.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:43:10.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:43:10.992 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:43:11.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:43:11.010 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_just_deep_scrubs ------------------ 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_pg_dump_objects_scrubbed ------------------- 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_just_deep_scrubs ------------------' 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_pg_dump_objects_scrubbed -------------------' 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test 2026-03-08T23:43:11.011 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:43:11.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:43:11.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:43:11.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T23:43:11.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:43:11.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:43:11.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:43:11.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:43:11.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:43:11.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:11.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:43:11.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:43:11.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:11.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:43:11.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:43:11.017 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:43:11.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:43:11.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:43:11.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:43:11.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:43:11.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test 2026-03-08T23:43:11.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:43:11.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827 
2026-03-08T23:43:11.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_pg_dump_objects_scrubbed ----------------------- 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_pg_dump_objects_scrubbed -----------------------' 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_pg_dump_objects_scrubbed td/osd-scrub-test 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:651: TEST_pg_dump_objects_scrubbed: local dir=td/osd-scrub-test 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:652: TEST_pg_dump_objects_scrubbed: local poolname=test 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:653: TEST_pg_dump_objects_scrubbed: local OSDS=3 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:654: 
TEST_pg_dump_objects_scrubbed: local objects=15 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:655: TEST_pg_dump_objects_scrubbed: local timeout=10 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:657: TEST_pg_dump_objects_scrubbed: TESTDATA=testdata.475827 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:659: TEST_pg_dump_objects_scrubbed: setup td/osd-scrub-test 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local 
trace=true 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:43:11.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:43:11.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:43:11.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:43:11.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:43:11.025 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:43:11.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:43:11.025 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:43:11.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:43:11.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:11.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:43:11.027 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:43:11.027 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:11.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:43:11.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:43:11.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:43:11.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:43:11.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:43:11.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:43:11.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:43:11.031 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test 2026-03-08T23:43:11.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:43:11.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:660: TEST_pg_dump_objects_scrubbed: run_mon td/osd-scrub-test a --osd_pool_default_size=3 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-test 2026-03-08T23:43:11.033 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-test/a 2026-03-08T23:43:11.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-test/a --run-dir=td/osd-scrub-test --osd_pool_default_size=3 2026-03-08T23:43:11.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:43:11.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:43:11.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:43:11.059 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:43:11.060 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.060 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.060 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:11.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-test/a '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-test/log --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=3 2026-03-08T23:43:11.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:43:11.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:43:11.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:43:11.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:43:11.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:43:11.091 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 
2026-03-08T23:43:11.094 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:43:11.094 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:43:11.094 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:43:11.094 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:43:11.094 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.095 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.095 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:43:11.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:43:11.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get fsid 2026-03-08T23:43:11.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:43:11.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:43:11.163 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:43:11.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:43:11.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:43:11.164 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:43:11.164 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:43:11.164 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:43:11.164 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:43:11.164 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.164 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.164 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:43:11.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:43:11.166 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get mon_host 2026-03-08T23:43:11.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:661: TEST_pg_dump_objects_scrubbed: run_mgr td/osd-scrub-test x --mgr_stats_period=1 2026-03-08T23:43:11.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-test 2026-03-08T23:43:11.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:43:11.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:43:11.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:43:11.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-test/x 2026-03-08T23:43:11.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:43:11.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:43:11.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:43:11.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:43:11.343 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:43:11.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:11.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:43:11.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-test/x '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mgr_stats_period=1 2026-03-08T23:43:11.366 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:662: TEST_pg_dump_objects_scrubbed: expr 3 - 1 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:662: TEST_pg_dump_objects_scrubbed: seq 0 2 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:662: 
TEST_pg_dump_objects_scrubbed: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:664: TEST_pg_dump_objects_scrubbed: run_osd td/osd-scrub-test 0 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/0 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:43:11.368 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/0' 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/0/journal' 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:43:11.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:43:11.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:43:11.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:43:11.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:43:11.374 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:43:11.374 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:11.374 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:11.374 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:43:11.375 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:43:11.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/0 2026-03-08T23:43:11.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:43:11.376 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 07a4b490-de5b-44ea-8b9e-6535e841a4a8 2026-03-08T23:43:11.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=07a4b490-de5b-44ea-8b9e-6535e841a4a8 2026-03-08T23:43:11.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 07a4b490-de5b-44ea-8b9e-6535e841a4a8' 2026-03-08T23:43:11.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:43:11.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCPCa5pBsupFxAAsHGztCX/F4ygBNQ6eSwklw== 2026-03-08T23:43:11.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCPCa5pBsupFxAAsHGztCX/F4ygBNQ6eSwklw=="}' 2026-03-08T23:43:11.392 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 07a4b490-de5b-44ea-8b9e-6535e841a4a8 -i td/osd-scrub-test/0/new.json 2026-03-08T23:43:11.494 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:43:11.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/0/new.json 2026-03-08T23:43:11.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCPCa5pBsupFxAAsHGztCX/F4ygBNQ6eSwklw== --osd-uuid 07a4b490-de5b-44ea-8b9e-6535e841a4a8 2026-03-08T23:43:11.522 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:11.527+0000 7f3d4fa6c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:11.523 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:11.531+0000 7f3d4fa6c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:11.525 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:11.531+0000 7f3d4fa6c8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:43:11.525 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:11.531+0000 7f3d4fa6c8c0 -1 bdev(0x55615e056c00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:43:11.525 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:11.531+0000 7f3d4fa6c8c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid 2026-03-08T23:43:13.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/0/keyring 2026-03-08T23:43:13.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:43:13.784 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:43:13.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:43:13.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:43:13.895 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:43:13.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:43:13.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:43:13.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:43:13.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 
2026-03-08T23:43:13.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:43:13.940 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:13.947+0000 7efd36d8d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:13.940 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:13.947+0000 7efd36d8d8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:13.942 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:13.947+0000 7efd36d8d8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:43:14.031 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:43:14.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:43:14.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:43:14.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:43:14.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:43:14.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:43:14.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:43:14.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:43:14.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:43:14.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:43:14.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:43:14.903 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:14.911+0000 7efd36d8d8c0 -1 Falling back to public interface 2026-03-08T23:43:15.174 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:43:15.174 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:43:15.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:43:15.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:43:15.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:43:15.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:43:15.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:43:16.124 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:16.131+0000 7efd36d8d8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:43:16.344 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:43:16.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:43:16.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:43:16.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:43:16.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:43:16.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:43:16.556 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:43:17.558 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:43:17.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:43:17.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:43:17.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:43:17.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:43:17.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:43:17.732 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/333832350,v1:127.0.0.1:6803/333832350] [v2:127.0.0.1:6804/333832350,v1:127.0.0.1:6805/333832350] exists,up 07a4b490-de5b-44ea-8b9e-6535e841a4a8 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:662: TEST_pg_dump_objects_scrubbed: for osd in $(seq 0 
$(expr $OSDS - 1)) 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:664: TEST_pg_dump_objects_scrubbed: run_osd td/osd-scrub-test 1 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/1 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: 
run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/1' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/1/journal' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 
2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:43:17.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:43:17.734 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:43:17.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:43:17.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/1 2026-03-08T23:43:17.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:43:17.735 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 76cd5745-95b8-478c-8549-a804ad7aae46 2026-03-08T23:43:17.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=76cd5745-95b8-478c-8549-a804ad7aae46 2026-03-08T23:43:17.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 76cd5745-95b8-478c-8549-a804ad7aae46' 2026-03-08T23:43:17.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:43:17.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCVCa5p4wwsLRAAId8cKJk+qIfiw4ul3tiHzA== 2026-03-08T23:43:17.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCVCa5p4wwsLRAAId8cKJk+qIfiw4ul3tiHzA=="}' 2026-03-08T23:43:17.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 76cd5745-95b8-478c-8549-a804ad7aae46 -i td/osd-scrub-test/1/new.json 2026-03-08T23:43:17.920 
INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:43:17.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/1/new.json 2026-03-08T23:43:17.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCVCa5p4wwsLRAAId8cKJk+qIfiw4ul3tiHzA== --osd-uuid 76cd5745-95b8-478c-8549-a804ad7aae46 2026-03-08T23:43:17.951 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:17.959+0000 7fb7d402a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:17.953 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:17.959+0000 7fb7d402a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:17.955 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:17.963+0000 7fb7d402a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:43:17.955 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:17.963+0000 7fb7d402a8c0 -1 bdev(0x56346abe9c00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:43:17.955 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:17.963+0000 7fb7d402a8c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid 2026-03-08T23:43:20.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/1/keyring 2026-03-08T23:43:20.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:43:20.196 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:43:20.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:43:20.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:43:20.415 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:43:20.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:43:20.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 
--debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:43:20.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:43:20.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:43:20.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:43:20.431 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:20.439+0000 7fb2e29bd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:20.432 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:20.439+0000 7fb2e29bd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:20.434 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:20.439+0000 7fb2e29bd8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:43:20.598 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:43:20.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T23:43:20.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:43:20.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:43:20.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:43:20.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:43:20.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:20.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:43:20.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:20.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:43:20.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:21.139 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:21.147+0000 7fb2e29bd8c0 -1 Falling back to public interface
2026-03-08T23:43:21.774 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:43:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:43:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:21.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:43:21.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:22.113 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:22.119+0000 7fb2e29bd8c0 -1 osd.1 0 log_to_monitors true
2026-03-08T23:43:22.971 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:43:22.971 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:22.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:22.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:43:22.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:22.972 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:43:23.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:23.959 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:23.967+0000 7fb2de176640 -1 osd.1 0 waiting for initial osdmap
2026-03-08T23:43:24.177 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:43:24.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:24.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:24.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:43:24.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:24.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1942087657,v1:127.0.0.1:6811/1942087657] [v2:127.0.0.1:6812/1942087657,v1:127.0.0.1:6813/1942087657] exists,up 76cd5745-95b8-478c-8549-a804ad7aae46
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:662: TEST_pg_dump_objects_scrubbed: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:664: TEST_pg_dump_objects_scrubbed: run_osd td/osd-scrub-test 2
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/2
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 '
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/2'
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/2/journal'
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:43:24.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test'
2026-03-08T23:43:24.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:43:24.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:43:24.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:43:24.360 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:43:24.360 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:43:24.360 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:43:24.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+=
2026-03-08T23:43:24.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/2
2026-03-08T23:43:24.362 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:43:24.363 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 0c135aaf-b394-4da1-8fe9-6aff23b76ecb
2026-03-08T23:43:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0c135aaf-b394-4da1-8fe9-6aff23b76ecb
2026-03-08T23:43:24.363 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 0c135aaf-b394-4da1-8fe9-6aff23b76ecb'
2026-03-08T23:43:24.363 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:43:24.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCcCa5pXLjqFhAAr8+rVmf3lCFVchxHMi066g==
2026-03-08T23:43:24.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCcCa5pXLjqFhAAr8+rVmf3lCFVchxHMi066g=="}'
2026-03-08T23:43:24.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0c135aaf-b394-4da1-8fe9-6aff23b76ecb -i td/osd-scrub-test/2/new.json
2026-03-08T23:43:24.543 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:43:24.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/2/new.json
2026-03-08T23:43:24.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQCcCa5pXLjqFhAAr8+rVmf3lCFVchxHMi066g== --osd-uuid 0c135aaf-b394-4da1-8fe9-6aff23b76ecb
2026-03-08T23:43:24.575 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:24.583+0000 7f3c79f9a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:24.577 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:24.583+0000 7f3c79f9a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:24.578 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:24.587+0000 7f3c79f9a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:24.578 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:24.587+0000 7f3c79f9a8c0 -1 bdev(0x560abb17bc00 td/osd-scrub-test/2/block) open stat got: (1) Operation not permitted
2026-03-08T23:43:24.578 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:24.587+0000 7f3c79f9a8c0 -1 bluestore(td/osd-scrub-test/2) _read_fsid unparsable uuid
2026-03-08T23:43:27.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/2/keyring
2026-03-08T23:43:27.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:43:27.071 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository
2026-03-08T23:43:27.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T23:43:27.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:43:27.275 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2
2026-03-08T23:43:27.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T23:43:27.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:43:27.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:43:27.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:43:27.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:43:27.290 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:27.295+0000 7fbb94c358c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:27.295 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:27.303+0000 7fbb94c358c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:27.296 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:27.303+0000 7fbb94c358c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:27.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T23:43:27.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:43:27.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T23:43:27.460 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:43:27.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:43:27.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:43:27.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:27.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:43:27.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:27.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:43:27.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:28.247 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:28.255+0000 7fbb94c358c0 -1 Falling back to public interface
2026-03-08T23:43:28.637 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:43:28.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:28.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:28.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:43:28.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:28.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:43:28.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:29.219 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:29.227+0000 7fbb94c358c0 -1 osd.2 0 log_to_monitors true
2026-03-08T23:43:29.821 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:43:29.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:29.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:29.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:43:29.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:29.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:43:30.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:30.242 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:30.251+0000 7fbb903ee640 -1 osd.2 0 waiting for initial osdmap
2026-03-08T23:43:31.012 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:43:31.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:31.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:31.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:43:31.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:31.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:43:31.196 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3846942407,v1:127.0.0.1:6819/3846942407] [v2:127.0.0.1:6820/3846942407,v1:127.0.0.1:6821/3846942407] exists,up 0c135aaf-b394-4da1-8fe9-6aff23b76ecb
2026-03-08T23:43:31.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:43:31.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:43:31.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:43:31.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:668: TEST_pg_dump_objects_scrubbed: create_pool test 1 1
2026-03-08T23:43:31.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1
2026-03-08T23:43:31.460 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created
2026-03-08T23:43:31.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:669: TEST_pg_dump_objects_scrubbed: wait_for_clean
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:43:32.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:43:32.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:43:32.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:43:32.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:43:32.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:43:32.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:43:32.541 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:43:32.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:43:32.710 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:43:32.710 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:43:32.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:43:32.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:43:32.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:43:32.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836497
2026-03-08T23:43:32.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836497
2026-03-08T23:43:32.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497'
2026-03-08T23:43:32.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:43:32.792 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:43:32.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970
2026-03-08T23:43:32.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970
2026-03-08T23:43:32.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672970'
2026-03-08T23:43:32.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:43:32.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:43:32.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509444
2026-03-08T23:43:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509444
2026-03-08T23:43:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672970 2-64424509444'
2026-03-08T23:43:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:43:32.964 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836497
2026-03-08T23:43:32.964 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:43:32.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:43:32.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836497
2026-03-08T23:43:32.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:43:32.966 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836497
2026-03-08T23:43:32.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836497
2026-03-08T23:43:32.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836497'
2026-03-08T23:43:32.966 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:43:33.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836495 -lt 21474836497
2026-03-08T23:43:33.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:43:34.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:43:34.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:43:34.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836498 -lt 21474836497
2026-03-08T23:43:34.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:43:34.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672970
2026-03-08T23:43:34.323 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:43:34.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:43:34.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970
2026-03-08T23:43:34.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:43:34.325 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672970
2026-03-08T23:43:34.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970
2026-03-08T23:43:34.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970'
2026-03-08T23:43:34.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:43:34.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672971 -lt 42949672970
2026-03-08T23:43:34.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:43:34.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509444
2026-03-08T23:43:34.493 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:43:34.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:43:34.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509444
2026-03-08T23:43:34.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:43:34.495 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509444
2026-03-08T23:43:34.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509444
2026-03-08T23:43:34.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509444'
2026-03-08T23:43:34.496 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:43:34.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509445 -lt 64424509444
2026-03-08T23:43:34.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:43:34.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:43:34.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:43:34.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T23:43:34.892 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:43:34.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:43:34.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:43:34.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:43:34.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:43:34.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:43:34.892 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:43:35.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T23:43:35.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:43:35.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:43:35.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:43:35.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T23:43:35.281
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:43:35.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:43:35.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:670: TEST_pg_dump_objects_scrubbed: ceph osd dump 2026-03-08T23:43:35.282 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:670: TEST_pg_dump_objects_scrubbed: awk '{ print $2 }' 2026-03-08T23:43:35.283 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:670: TEST_pg_dump_objects_scrubbed: grep '^pool.*['\'']test['\'']' 2026-03-08T23:43:35.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:670: TEST_pg_dump_objects_scrubbed: poolid=1 2026-03-08T23:43:35.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:672: TEST_pg_dump_objects_scrubbed: dd if=/dev/urandom of=testdata.475827 bs=1032 count=1 2026-03-08T23:43:35.455 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:43:35.455 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:43:35.455 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 6.6895e-05 s, 15.4 MB/s 2026-03-08T23:43:35.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: seq 1 15 2026-03-08T23:43:35.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 
2026-03-08T23:43:35.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj1 testdata.475827 2026-03-08T23:43:35.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj2 testdata.475827 2026-03-08T23:43:35.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj3 testdata.475827 2026-03-08T23:43:35.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj4 testdata.475827 2026-03-08T23:43:35.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.550 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj5 testdata.475827 2026-03-08T23:43:35.572 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj6 testdata.475827 2026-03-08T23:43:35.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj7 testdata.475827 2026-03-08T23:43:35.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj8 testdata.475827 2026-03-08T23:43:35.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj9 testdata.475827 2026-03-08T23:43:35.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: 
TEST_pg_dump_objects_scrubbed: rados -p test put obj10 testdata.475827 2026-03-08T23:43:35.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj11 testdata.475827 2026-03-08T23:43:35.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj12 testdata.475827 2026-03-08T23:43:35.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj13 testdata.475827 2026-03-08T23:43:35.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj14 testdata.475827 2026-03-08T23:43:35.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:673: TEST_pg_dump_objects_scrubbed: for i in `seq 1 $objects` 2026-03-08T23:43:35.781 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:675: TEST_pg_dump_objects_scrubbed: rados -p test put obj15 testdata.475827 2026-03-08T23:43:35.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:677: TEST_pg_dump_objects_scrubbed: rm -f testdata.475827 2026-03-08T23:43:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:679: TEST_pg_dump_objects_scrubbed: local pgid=1.0 2026-03-08T23:43:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:681: TEST_pg_dump_objects_scrubbed: pg_scrub 1.0 2026-03-08T23:43:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1932: pg_scrub: local pgid=1.0 2026-03-08T23:43:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1934: pg_scrub: wait_for_pg_clean 1.0 2026-03-08T23:43:35.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=1.0 2026-03-08T23:43:35.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:43:35.809 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:43:35.809 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:43:35.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 
2026-03-08T23:43:35.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:43:35.810 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:43:35.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:43:35.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:43:35.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:43:35.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:43:35.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:43:35.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:43:36.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:43:36.158 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:43:36.158 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:43:36.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:43:36.158 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:43:36.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:43:36.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836501 2026-03-08T23:43:36.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836501 2026-03-08T23:43:36.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501' 2026-03-08T23:43:36.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:43:36.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:43:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672974 2026-03-08T23:43:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672974 2026-03-08T23:43:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672974' 2026-03-08T23:43:36.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:43:36.317 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:43:36.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509449 2026-03-08T23:43:36.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509449 2026-03-08T23:43:36.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836501 1-42949672974 2-64424509449' 2026-03-08T23:43:36.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:43:36.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836501 2026-03-08T23:43:36.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:43:36.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:43:36.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836501 2026-03-08T23:43:36.401 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:43:36.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836501 2026-03-08T23:43:36.402 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836501 
2026-03-08T23:43:36.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836501' 2026-03-08T23:43:36.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:43:36.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836500 -lt 21474836501 2026-03-08T23:43:36.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:43:37.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:43:37.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:43:37.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836500 -lt 21474836501 2026-03-08T23:43:37.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:43:38.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:43:38.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:43:38.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836503 -lt 21474836501 2026-03-08T23:43:38.939 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:43:38.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672974 2026-03-08T23:43:38.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:43:38.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:43:38.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672974 2026-03-08T23:43:38.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:43:38.942 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672974 2026-03-08T23:43:38.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672974 2026-03-08T23:43:38.942 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672974' 2026-03-08T23:43:38.942 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:43:39.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672976 -lt 42949672974 2026-03-08T23:43:39.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:43:39.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509449 2026-03-08T23:43:39.117 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:43:39.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:43:39.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509449 2026-03-08T23:43:39.119 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:43:39.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509449 2026-03-08T23:43:39.120 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509449 2026-03-08T23:43:39.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509449' 2026-03-08T23:43:39.120 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509450 -lt 64424509449 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo 
'#---------- 1.0 loop 0' 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 1.0 loop 0 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 1.0 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=1.0 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 1.0 query 2026-03-08T23:43:39.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: get_last_scrub_stamp 1.0 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local 
pgid=1.0 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:43:39.385 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:43:39.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1935: pg_scrub: local last_scrub=2026-03-08T23:43:31.466791+0000 2026-03-08T23:43:39.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1936: pg_scrub: ceph pg scrub 1.0 2026-03-08T23:43:39.726 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to scrub 2026-03-08T23:43:39.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1937: pg_scrub: wait_for_scrub 1.0 2026-03-08T23:43:31.466791+0000 2026-03-08T23:43:39.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:43:31.466791+0000 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 
)) 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:43:39.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:43:39.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:43:31.466791+0000 '>' 2026-03-08T23:43:31.466791+0000 2026-03-08T23:43:39.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:43:40.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:43:40.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:43:40.916 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:43:40.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:43:40.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:43:40.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:43:40.916 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:43:41.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:43:31.466791+0000 '>' 2026-03-08T23:43:31.466791+0000 2026-03-08T23:43:41.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:43:42.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:43:42.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:43:42.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:43:42.089 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:43:42.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:43:42.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:43:42.089 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:43:42.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:43:40.301902+0000 '>' 2026-03-08T23:43:31.466791+0000 2026-03-08T23:43:42.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:43:42.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:682: TEST_pg_dump_objects_scrubbed: ceph pg 1.0 query 2026-03-08T23:43:42.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:682: TEST_pg_dump_objects_scrubbed: jq .info.stats.objects_scrubbed 2026-03-08T23:43:42.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:682: TEST_pg_dump_objects_scrubbed: test 15 = 15 2026-03-08T23:43:42.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:683: TEST_pg_dump_objects_scrubbed: perf_counters td/osd-scrub-test 3 2026-03-08T23:43:42.351 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:43: perf_counters: local dir=td/osd-scrub-test 2026-03-08T23:43:42.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:44: perf_counters: local OSDS=3 2026-03-08T23:43:42.351 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: expr 3 - 1 2026-03-08T23:43:42.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: seq 0 2 2026-03-08T23:43:42.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:43:42.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.0 counter dump 2026-03-08T23:43:42.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 
2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.424 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.425 
INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 
2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.425 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: 
"reservation_process_skipped": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.426 
INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.426 
INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.426 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: 
"sum": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.427 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.434 
INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.1 counter dump 2026-03-08T23:43:42.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: 
"avgtime": 0 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.502 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 
2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.503 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.503 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: 
"replicas_in_reservation": 0 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.504 
INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.504 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: 
"osd_scrub_sh_repl": [ 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 1, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 1, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 1, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 1, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0.004000018, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0.004000018 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 1, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.505 
INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.505 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 1, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 2 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:43:42.512 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.2 counter dump 2026-03-08T23:43:42.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.582 
INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.582 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.583 
INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 
2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:43:42.583 
INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.583 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.584 
INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 
2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:43:42.584 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:43:42.585 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:43:42.585 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:43:42.585 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:43:42.585 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:43:42.585 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:43:42.592 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.592 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.592 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.592 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.592 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:43:42.592 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:685: TEST_pg_dump_objects_scrubbed: teardown td/osd-scrub-test 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:43:42.593 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:43:42.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:43:42.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:43:42.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:43:42.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:43:42.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:43:42.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:43:42.714 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:43:42.714 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:43:42.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:42.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:43:42.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:43:42.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:42.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:43:42.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:43:42.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:43:42.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:43:42.730 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:42.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:42.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_pg_dump_objects_scrubbed ------------------ 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_pg_dump_objects_scrubbed ------------------' 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:43:42.732 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:43:42.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:43:42.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:43:42.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:43:42.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:43:42.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:43:42.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:43:42.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:43:42.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:43:42.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:42.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:43:42.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:43:42.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:42.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:43:42.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:43:42.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:43:42.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:43:42.739 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:42.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:42.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_pg_dump_objects_scrubbed ------------------ 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_scrub_abort ------------------- 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_pg_dump_objects_scrubbed ------------------' 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_scrub_abort -------------------' 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test 2026-03-08T23:43:42.741 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:43:42.741 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:43:42.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:43:42.743 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T23:43:42.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:43:42.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:43:42.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:43:42.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:43:42.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:43:42.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:42.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:43:42.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:43:42.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:43:42.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:43:42.749 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:43:42.749 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test
2026-03-08T23:43:42.749 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:43:42.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:43:42.750 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:43:42.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827
2026-03-08T23:43:42.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:43:42.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:43:42.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test
2026-03-08T23:43:42.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:43:42.751 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:43:42.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:43:42.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_scrub_abort -----------------------
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_scrub_abort -----------------------'
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_scrub_abort td/osd-scrub-test
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:409: TEST_scrub_abort: local dir=td/osd-scrub-test
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:410: TEST_scrub_abort: _scrub_abort td/osd-scrub-test scrub
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:298: _scrub_abort: local dir=td/osd-scrub-test
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:299: _scrub_abort: local poolname=test
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:300: _scrub_abort: local OSDS=3
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:301: _scrub_abort: local objects=1000
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:302: _scrub_abort: local type=scrub
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:304: _scrub_abort: TESTDATA=testdata.475827
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:305: _scrub_abort: test scrub = scrub
2026-03-08T23:43:42.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:307: _scrub_abort: stopscrub=noscrub
2026-03-08T23:43:42.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:308: _scrub_abort: check=noscrub
2026-03-08T23:43:42.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:314: _scrub_abort: run_mon td/osd-scrub-test a --osd_pool_default_size=3
2026-03-08T23:43:42.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-test
2026-03-08T23:43:42.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:43:42.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:43:42.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:43:42.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-test/a
2026-03-08T23:43:42.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-test/a --run-dir=td/osd-scrub-test --osd_pool_default_size=3
2026-03-08T23:43:42.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:43:42.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:43:42.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:43:42.777 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:43:42.777 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:43:42.777 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:43:42.777 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:43:42.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-test/a '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-test/log --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=3
2026-03-08T23:43:42.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:43:42.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:43:42.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:43:42.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:43:42.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:43:42.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:43:42.807 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:43:42.807 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:43:42.807 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:43:42.808 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:43:42.808 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:43:42.808 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:43:42.808 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok
2026-03-08T23:43:42.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:43:42.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get fsid
2026-03-08T23:43:42.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:43:42.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:43:42.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:43:42.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:43:42.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:43:42.884 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:43:42.884 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:43:42.884 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:43:42.884 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:43:42.884 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:43:42.884 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:43:42.884 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok
2026-03-08T23:43:42.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:43:42.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get mon_host
2026-03-08T23:43:42.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:315: _scrub_abort: run_mgr td/osd-scrub-test x --mgr_stats_period=1
2026-03-08T23:43:42.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-test
2026-03-08T23:43:42.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:43:42.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:43:42.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:43:42.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-test/x
2026-03-08T23:43:42.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:43:43.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:43:43.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:43:43.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:43:43.063 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:43:43.063 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:43:43.063 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:43:43.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:43:43.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:43:43.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-test/x '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mgr_stats_period=1
2026-03-08T23:43:43.090 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: expr 3 - 1
2026-03-08T23:43:43.091 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: seq 0 2
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:322: _scrub_abort: run_osd td/osd-scrub-test 0 --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/0
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 '
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/0'
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/0/journal'
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:43:43.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test'
2026-03-08T23:43:43.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:43:43.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:43:43.093 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq'
2026-03-08T23:43:43.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/0
2026-03-08T23:43:43.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:43:43.096 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 56ffc90b-fa5b-4632-8763-0b2f1d1b42d0
2026-03-08T23:43:43.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=56ffc90b-fa5b-4632-8763-0b2f1d1b42d0
2026-03-08T23:43:43.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 56ffc90b-fa5b-4632-8763-0b2f1d1b42d0'
2026-03-08T23:43:43.096 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:43:43.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCvCa5poCMGBxAAdchWqzVeyJxPSYWA2G0pnQ==
2026-03-08T23:43:43.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCvCa5poCMGBxAAdchWqzVeyJxPSYWA2G0pnQ=="}'
2026-03-08T23:43:43.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 56ffc90b-fa5b-4632-8763-0b2f1d1b42d0 -i td/osd-scrub-test/0/new.json
2026-03-08T23:43:43.216 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:43:43.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/0/new.json
2026-03-08T23:43:43.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq --mkfs --key AQCvCa5poCMGBxAAdchWqzVeyJxPSYWA2G0pnQ== --osd-uuid 56ffc90b-fa5b-4632-8763-0b2f1d1b42d0
2026-03-08T23:43:43.254 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:43.259+0000 7f8acc2748c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:43.259 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:43.267+0000 7f8acc2748c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:43.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:43.267+0000 7f8acc2748c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:43.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:43.267+0000 7f8acc2748c0 -1 bdev(0x564c2814cc00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted
2026-03-08T23:43:43.260 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:43.267+0000 7f8acc2748c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid
2026-03-08T23:43:45.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/0/keyring
2026-03-08T23:43:45.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:43:45.541 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository
2026-03-08T23:43:45.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository
2026-03-08T23:43:45.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:43:45.649 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0
2026-03-08T23:43:45.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0
2026-03-08T23:43:45.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:43:45.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:43:45.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:43:45.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq
2026-03-08T23:43:45.687 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:45.695+0000 7f3cc374e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:45.689 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:45.695+0000 7f3cc374e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:45.691 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:45.699+0000 7f3cc374e8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:45.793 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:45.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:43:45.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:46.643 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:46.651+0000 7f3cc374e8c0 -1 Falling back to public interface
2026-03-08T23:43:46.974 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:43:46.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:46.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:46.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:43:46.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:46.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:43:47.150 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:47.620 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:47.627+0000 7f3cc374e8c0 -1 osd.0 0 log_to_monitors true
2026-03-08T23:43:48.151 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:43:48.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:48.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:48.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:43:48.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:48.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up'
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2184325829,v1:127.0.0.1:6803/2184325829] [v2:127.0.0.1:6804/2184325829,v1:127.0.0.1:6805/2184325829] exists,up 56ffc90b-fa5b-4632-8763-0b2f1d1b42d0
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:322: _scrub_abort: run_osd td/osd-scrub-test 1 --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/1
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 '
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/1'
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/1/journal'
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:43:48.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test'
2026-03-08T23:43:48.330
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:43:48.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:43:48.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:43:48.330 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:43:48.330 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:48.330 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:48.330 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:43:48.331 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq' 2026-03-08T23:43:48.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/1 2026-03-08T23:43:48.332 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 
2026-03-08T23:43:48.333 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 5ded3e95-48b2-4203-bfc2-328418850e7a 2026-03-08T23:43:48.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=5ded3e95-48b2-4203-bfc2-328418850e7a 2026-03-08T23:43:48.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 5ded3e95-48b2-4203-bfc2-328418850e7a' 2026-03-08T23:43:48.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:43:48.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC0Ca5pQ5kdFRAAydRnaYye3VCAv7ZuXs+j5w== 2026-03-08T23:43:48.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC0Ca5pQ5kdFRAAydRnaYye3VCAv7ZuXs+j5w=="}' 2026-03-08T23:43:48.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 5ded3e95-48b2-4203-bfc2-328418850e7a -i td/osd-scrub-test/1/new.json 2026-03-08T23:43:48.513 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:43:48.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/1/new.json 2026-03-08T23:43:48.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= 
--run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq --mkfs --key AQC0Ca5pQ5kdFRAAydRnaYye3VCAv7ZuXs+j5w== --osd-uuid 5ded3e95-48b2-4203-bfc2-328418850e7a 2026-03-08T23:43:48.547 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:48.555+0000 7faac80ce8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:48.549 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:48.555+0000 7faac80ce8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:48.550 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:48.559+0000 7faac80ce8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:43:48.550 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:48.559+0000 7faac80ce8c0 -1 bdev(0x5586c22d3c00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:43:48.551 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:48.559+0000 7faac80ce8c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid 2026-03-08T23:43:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/1/keyring 2026-03-08T23:43:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:43:50.815 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:43:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:43:50.815 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:43:51.035 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:43:51.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:43:51.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 
--debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq 2026-03-08T23:43:51.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:43:51.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:43:51.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:43:51.051 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:51.059+0000 7f820eccc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:51.052 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:51.059+0000 7f820eccc8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:51.054 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:51.059+0000 7f820eccc8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:43:51.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:43:51.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:43:51.515 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:51.523+0000 7f820eccc8c0 -1 Falling back to public interface 2026-03-08T23:43:52.408 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:43:52.408 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:43:52.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:43:52.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:43:52.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:43:52.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:43:52.497 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:52.503+0000 7f820eccc8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:43:52.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:43:53.602 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:43:53.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:43:53.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:43:53.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:43:53.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:43:53.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:43:53.781 
INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2115298408,v1:127.0.0.1:6811/2115298408] [v2:127.0.0.1:6812/2115298408,v1:127.0.0.1:6813/2115298408] exists,up 5ded3e95-48b2-4203-bfc2-328418850e7a 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:316: _scrub_abort: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:322: _scrub_abort: run_osd td/osd-scrub-test 2 --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:43:53.782 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/2 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/2' 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/2/journal' 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:43:53.782 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:43:53.782 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:43:53.783 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:43:53.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:53.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:43:53.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:43:53.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:43:53.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:43:53.784 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:43:53.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:43:53.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:43:53.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:43:53.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:43:53.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:43:53.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:43:53.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq' 2026-03-08T23:43:53.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/2 2026-03-08T23:43:53.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 
2026-03-08T23:43:53.788 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 25081f65-78b1-4e3a-8af3-0591631a0376 2026-03-08T23:43:53.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=25081f65-78b1-4e3a-8af3-0591631a0376 2026-03-08T23:43:53.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 25081f65-78b1-4e3a-8af3-0591631a0376' 2026-03-08T23:43:53.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:43:53.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQC5Ca5pM+A6MBAA9yi+tPN/wMnkQxzHmSMO4A== 2026-03-08T23:43:53.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQC5Ca5pM+A6MBAA9yi+tPN/wMnkQxzHmSMO4A=="}' 2026-03-08T23:43:53.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 25081f65-78b1-4e3a-8af3-0591631a0376 -i td/osd-scrub-test/2/new.json 2026-03-08T23:43:53.967 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:43:53.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/2/new.json 2026-03-08T23:43:53.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= 
--run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq --mkfs --key AQC5Ca5pM+A6MBAA9yi+tPN/wMnkQxzHmSMO4A== --osd-uuid 25081f65-78b1-4e3a-8af3-0591631a0376 2026-03-08T23:43:54.002 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:54.011+0000 7feabf16b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:54.004 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:54.011+0000 7feabf16b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:43:54.005 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:54.011+0000 7feabf16b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:43:54.005 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:54.011+0000 7feabf16b8c0 -1 bdev(0x559f67c91c00 td/osd-scrub-test/2/block) open stat got: (1) Operation not permitted
2026-03-08T23:43:54.005 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:54.011+0000 7feabf16b8c0 -1 bluestore(td/osd-scrub-test/2) _read_fsid unparsable uuid
2026-03-08T23:43:56.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/2/keyring
2026-03-08T23:43:56.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:43:56.521 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository
2026-03-08T23:43:56.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T23:43:56.521 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:43:56.742 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2
2026-03-08T23:43:56.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T23:43:56.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd_pool_default_pg_autoscale_mode=off --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_sleep=5.0 --osd_scrub_interval_randomize_ratio=0 --osd_op_queue=wpq
2026-03-08T23:43:56.742 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:43:56.743 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:43:56.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:43:56.761 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:56.767+0000 7f4dd6ace8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:56.764 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:56.771+0000 7f4dd6ace8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:56.766 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:56.771+0000 7f4dd6ace8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:56.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:43:57.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:57.975 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:57.983+0000 7f4dd6ace8c0 -1 Falling back to public interface
2026-03-08T23:43:58.121 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:43:58.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:58.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:58.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:43:58.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:58.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:43:58.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:43:58.954 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:43:58.963+0000 7f4dd6ace8c0 -1 osd.2 0 log_to_monitors true
2026-03-08T23:43:59.294 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:43:59.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:43:59.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:43:59.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:43:59.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:43:59.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:43:59.473 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 14 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2708458405,v1:127.0.0.1:6819/2708458405] [v2:127.0.0.1:6820/2708458405,v1:127.0.0.1:6821/2708458405] exists,up 25081f65-78b1-4e3a-8af3-0591631a0376
2026-03-08T23:43:59.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:43:59.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:43:59.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:43:59.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:330: _scrub_abort: create_pool test 1 1
2026-03-08T23:43:59.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1
2026-03-08T23:43:59.693 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created
2026-03-08T23:43:59.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1
2026-03-08T23:44:00.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:331: _scrub_abort: wait_for_clean
2026-03-08T23:44:00.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:44:00.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:44:00.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:44:00.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:44:00.716 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:44:00.716 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:44:00.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:44:00.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:44:00.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:44:00.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:44:00.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:44:00.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:44:00.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:44:00.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:44:00.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:44:00.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:44:00.964 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:44:00.964 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:44:00.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:44:00.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:00.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:44:01.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836494
2026-03-08T23:44:01.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836494
2026-03-08T23:44:01.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494'
2026-03-08T23:44:01.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:01.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:44:01.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672969
2026-03-08T23:44:01.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672969
2026-03-08T23:44:01.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-42949672969'
2026-03-08T23:44:01.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:01.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:44:01.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542147
2026-03-08T23:44:01.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542147
2026-03-08T23:44:01.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836494 1-42949672969 2-60129542147'
2026-03-08T23:44:01.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:01.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836494
2026-03-08T23:44:01.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:01.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:44:01.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836494
2026-03-08T23:44:01.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:01.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836494
2026-03-08T23:44:01.232 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836494
2026-03-08T23:44:01.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836494'
2026-03-08T23:44:01.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:01.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836491 -lt 21474836494
2026-03-08T23:44:01.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:44:02.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:44:02.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:02.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836494 -lt 21474836494
2026-03-08T23:44:02.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:02.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672969
2026-03-08T23:44:02.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:02.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:44:02.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672969
2026-03-08T23:44:02.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:02.595 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672969
2026-03-08T23:44:02.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672969
2026-03-08T23:44:02.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672969'
2026-03-08T23:44:02.595 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:44:02.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672970 -lt 42949672969
2026-03-08T23:44:02.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:02.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542147
2026-03-08T23:44:02.766 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:02.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:44:02.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542147
2026-03-08T23:44:02.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:02.769 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542147
2026-03-08T23:44:02.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542147
2026-03-08T23:44:02.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542147'
2026-03-08T23:44:02.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:44:02.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542147 -lt 60129542147
2026-03-08T23:44:02.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:44:02.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:44:02.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:44:03.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T23:44:03.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:44:03.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:44:03.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:44:03.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:44:03.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:44:03.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:44:03.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:44:03.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T23:44:03.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:44:03.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:44:03.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:44:03.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T23:44:03.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:44:03.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:44:03.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:332: _scrub_abort: ceph osd dump
2026-03-08T23:44:03.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:332: _scrub_abort: awk '{ print $2 }'
2026-03-08T23:44:03.540 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:332: _scrub_abort: grep '^pool.*['\'']test['\'']'
2026-03-08T23:44:03.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:332: _scrub_abort: poolid=1
2026-03-08T23:44:03.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:334: _scrub_abort: dd if=/dev/urandom of=testdata.475827 bs=1032 count=1
2026-03-08T23:44:03.712 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in
2026-03-08T23:44:03.712 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out
2026-03-08T23:44:03.712 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 4.0776e-05 s, 25.3 MB/s
2026-03-08T23:44:03.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: seq 1 1000
2026-03-08T23:44:03.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj1 testdata.475827
2026-03-08T23:44:03.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.740 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj2 testdata.475827
2026-03-08T23:44:03.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj3 testdata.475827
2026-03-08T23:44:03.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj4 testdata.475827
2026-03-08T23:44:03.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj5 testdata.475827
2026-03-08T23:44:03.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj6 testdata.475827
2026-03-08T23:44:03.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj7 testdata.475827
2026-03-08T23:44:03.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj8 testdata.475827
2026-03-08T23:44:03.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj9 testdata.475827
2026-03-08T23:44:03.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj10 testdata.475827
2026-03-08T23:44:03.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj11 testdata.475827
2026-03-08T23:44:03.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:03.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj12 testdata.475827
2026-03-08T23:44:04.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj13 testdata.475827
2026-03-08T23:44:04.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.023 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj14 testdata.475827
2026-03-08T23:44:04.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj15 testdata.475827
2026-03-08T23:44:04.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj16 testdata.475827
2026-03-08T23:44:04.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj17 testdata.475827
2026-03-08T23:44:04.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj18 testdata.475827
2026-03-08T23:44:04.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.131 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj19 testdata.475827
2026-03-08T23:44:04.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.155 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj20 testdata.475827
2026-03-08T23:44:04.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj21 testdata.475827
2026-03-08T23:44:04.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.198 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj22 testdata.475827
2026-03-08T23:44:04.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj23 testdata.475827
2026-03-08T23:44:04.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj24 testdata.475827
2026-03-08T23:44:04.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj25 testdata.475827
2026-03-08T23:44:04.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.289 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj26 testdata.475827
2026-03-08T23:44:04.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.312 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj27 testdata.475827
2026-03-08T23:44:04.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj28 testdata.475827
2026-03-08T23:44:04.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj29 testdata.475827
2026-03-08T23:44:04.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj30 testdata.475827
2026-03-08T23:44:04.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj31 testdata.475827
2026-03-08T23:44:04.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj32 testdata.475827
2026-03-08T23:44:04.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj33 testdata.475827
2026-03-08T23:44:04.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj34 testdata.475827
2026-03-08T23:44:04.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj35 testdata.475827
2026-03-08T23:44:04.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj36 testdata.475827
2026-03-08T23:44:04.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj37 testdata.475827
2026-03-08T23:44:04.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:04.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj38 testdata.475827
2026-03-08T23:44:04.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.587 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj39 testdata.475827 2026-03-08T23:44:04.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj40 testdata.475827 2026-03-08T23:44:04.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj41 testdata.475827 2026-03-08T23:44:04.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj42 testdata.475827 2026-03-08T23:44:04.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj43 testdata.475827 2026-03-08T23:44:04.705 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.705 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj44 testdata.475827 2026-03-08T23:44:04.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj45 testdata.475827 2026-03-08T23:44:04.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj46 testdata.475827 2026-03-08T23:44:04.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj47 testdata.475827 2026-03-08T23:44:04.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj48 testdata.475827 2026-03-08T23:44:04.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.822 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj49 testdata.475827 2026-03-08T23:44:04.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj50 testdata.475827 2026-03-08T23:44:04.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj51 testdata.475827 2026-03-08T23:44:04.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj52 testdata.475827 2026-03-08T23:44:04.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:04.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj53 testdata.475827 2026-03-08T23:44:05.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.043 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj54 testdata.475827 2026-03-08T23:44:05.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj55 testdata.475827 2026-03-08T23:44:05.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj56 testdata.475827 2026-03-08T23:44:05.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj57 testdata.475827 2026-03-08T23:44:05.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj58 testdata.475827 2026-03-08T23:44:05.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.207 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj59 testdata.475827 2026-03-08T23:44:05.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj60 testdata.475827 2026-03-08T23:44:05.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj61 testdata.475827 2026-03-08T23:44:05.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj62 testdata.475827 2026-03-08T23:44:05.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj63 testdata.475827 2026-03-08T23:44:05.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.327 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj64 testdata.475827 2026-03-08T23:44:05.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj65 testdata.475827 2026-03-08T23:44:05.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj66 testdata.475827 2026-03-08T23:44:05.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj67 testdata.475827 2026-03-08T23:44:05.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj68 testdata.475827 2026-03-08T23:44:05.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.449 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj69 testdata.475827 2026-03-08T23:44:05.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj70 testdata.475827 2026-03-08T23:44:05.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj71 testdata.475827 2026-03-08T23:44:05.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj72 testdata.475827 2026-03-08T23:44:05.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj73 testdata.475827 2026-03-08T23:44:05.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.567 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj74 testdata.475827 2026-03-08T23:44:05.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj75 testdata.475827 2026-03-08T23:44:05.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj76 testdata.475827 2026-03-08T23:44:05.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj77 testdata.475827 2026-03-08T23:44:05.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj78 testdata.475827 2026-03-08T23:44:05.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.689 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj79 testdata.475827 2026-03-08T23:44:05.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj80 testdata.475827 2026-03-08T23:44:05.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj81 testdata.475827 2026-03-08T23:44:05.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj82 testdata.475827 2026-03-08T23:44:05.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj83 testdata.475827 2026-03-08T23:44:05.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.816 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj84 testdata.475827 2026-03-08T23:44:05.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj85 testdata.475827 2026-03-08T23:44:05.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj86 testdata.475827 2026-03-08T23:44:05.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj87 testdata.475827 2026-03-08T23:44:05.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj88 testdata.475827 2026-03-08T23:44:05.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.919 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj89 testdata.475827 2026-03-08T23:44:05.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj90 testdata.475827 2026-03-08T23:44:05.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj91 testdata.475827 2026-03-08T23:44:05.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:05.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj92 testdata.475827 2026-03-08T23:44:06.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj93 testdata.475827 2026-03-08T23:44:06.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.029 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj94 testdata.475827 2026-03-08T23:44:06.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj95 testdata.475827 2026-03-08T23:44:06.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj96 testdata.475827 2026-03-08T23:44:06.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj97 testdata.475827 2026-03-08T23:44:06.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.117 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj98 testdata.475827 2026-03-08T23:44:06.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.140 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj99 testdata.475827 2026-03-08T23:44:06.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj100 testdata.475827 2026-03-08T23:44:06.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj101 testdata.475827 2026-03-08T23:44:06.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj102 testdata.475827 2026-03-08T23:44:06.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj103 testdata.475827 2026-03-08T23:44:06.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:06.264 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj104 testdata.475827
[... identical xtrace loop iterations elided: osd-scrub-test.sh:335 (`for i in `seq 1 $objects``) and :337 (`rados -p test put obj<i> testdata.475827`) repeat for obj105 through obj207, from 2026-03-08T23:44:06.287 to 2026-03-08T23:44:08.556, roughly one put every 20-25 ms ...]
2026-03-08T23:44:08.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj208 testdata.475827
2026-03-08T23:44:08.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:08.601 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj209 testdata.475827 2026-03-08T23:44:08.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj210 testdata.475827 2026-03-08T23:44:08.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj211 testdata.475827 2026-03-08T23:44:08.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj212 testdata.475827 2026-03-08T23:44:08.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj213 testdata.475827 2026-03-08T23:44:08.711 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.711 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj214 testdata.475827 2026-03-08T23:44:08.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj215 testdata.475827 2026-03-08T23:44:08.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj216 testdata.475827 2026-03-08T23:44:08.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj217 testdata.475827 2026-03-08T23:44:08.798 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj218 testdata.475827 2026-03-08T23:44:08.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.821 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj219 testdata.475827 2026-03-08T23:44:08.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj220 testdata.475827 2026-03-08T23:44:08.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.865 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj221 testdata.475827 2026-03-08T23:44:08.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj222 testdata.475827 2026-03-08T23:44:08.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj223 testdata.475827 2026-03-08T23:44:08.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.932 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj224 testdata.475827 2026-03-08T23:44:08.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj225 testdata.475827 2026-03-08T23:44:08.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj226 testdata.475827 2026-03-08T23:44:08.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:08.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj227 testdata.475827 2026-03-08T23:44:09.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj228 testdata.475827 2026-03-08T23:44:09.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.042 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj229 testdata.475827 2026-03-08T23:44:09.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj230 testdata.475827 2026-03-08T23:44:09.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj231 testdata.475827 2026-03-08T23:44:09.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.110 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj232 testdata.475827 2026-03-08T23:44:09.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj233 testdata.475827 2026-03-08T23:44:09.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.157 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj234 testdata.475827 2026-03-08T23:44:09.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj235 testdata.475827 2026-03-08T23:44:09.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj236 testdata.475827 2026-03-08T23:44:09.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj237 testdata.475827 2026-03-08T23:44:09.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj238 testdata.475827 2026-03-08T23:44:09.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.271 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj239 testdata.475827 2026-03-08T23:44:09.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj240 testdata.475827 2026-03-08T23:44:09.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj241 testdata.475827 2026-03-08T23:44:09.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj242 testdata.475827 2026-03-08T23:44:09.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj243 testdata.475827 2026-03-08T23:44:09.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.377 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj244 testdata.475827 2026-03-08T23:44:09.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj245 testdata.475827 2026-03-08T23:44:09.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj246 testdata.475827 2026-03-08T23:44:09.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj247 testdata.475827 2026-03-08T23:44:09.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj248 testdata.475827 2026-03-08T23:44:09.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.490 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj249 testdata.475827 2026-03-08T23:44:09.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj250 testdata.475827 2026-03-08T23:44:09.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj251 testdata.475827 2026-03-08T23:44:09.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj252 testdata.475827 2026-03-08T23:44:09.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj253 testdata.475827 2026-03-08T23:44:09.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.602 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj254 testdata.475827 2026-03-08T23:44:09.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj255 testdata.475827 2026-03-08T23:44:09.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj256 testdata.475827 2026-03-08T23:44:09.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.671 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj257 testdata.475827 2026-03-08T23:44:09.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj258 testdata.475827 2026-03-08T23:44:09.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.717 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj259 testdata.475827 2026-03-08T23:44:09.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj260 testdata.475827 2026-03-08T23:44:09.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj261 testdata.475827 2026-03-08T23:44:09.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj262 testdata.475827 2026-03-08T23:44:09.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.821 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj263 testdata.475827 2026-03-08T23:44:09.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.847 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj264 testdata.475827 2026-03-08T23:44:09.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj265 testdata.475827 2026-03-08T23:44:09.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj266 testdata.475827 2026-03-08T23:44:09.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj267 testdata.475827 2026-03-08T23:44:09.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.947 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj268 testdata.475827 2026-03-08T23:44:09.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.970 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj269 testdata.475827 2026-03-08T23:44:09.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:09.994 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj270 testdata.475827 2026-03-08T23:44:10.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:10.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj271 testdata.475827 2026-03-08T23:44:10.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:10.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj272 testdata.475827 2026-03-08T23:44:10.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:10.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj273 testdata.475827 2026-03-08T23:44:10.096 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:10.096 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj274 testdata.475827
[the two xtrace records above repeat unchanged for obj275 through obj378, timestamps 2026-03-08T23:44:10.120 through 2026-03-08T23:44:12.662]
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj378 testdata.475827
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj379 testdata.475827 2026-03-08T23:44:12.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj380 testdata.475827 2026-03-08T23:44:12.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj381 testdata.475827 2026-03-08T23:44:12.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj382 testdata.475827 2026-03-08T23:44:12.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.764 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj383 testdata.475827 2026-03-08T23:44:12.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.794 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj384 testdata.475827 2026-03-08T23:44:12.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj385 testdata.475827 2026-03-08T23:44:12.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.848 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj386 testdata.475827 2026-03-08T23:44:12.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj387 testdata.475827 2026-03-08T23:44:12.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj388 testdata.475827 2026-03-08T23:44:12.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.925 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj389 testdata.475827 2026-03-08T23:44:12.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj390 testdata.475827 2026-03-08T23:44:12.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:12.978 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj391 testdata.475827 2026-03-08T23:44:13.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj392 testdata.475827 2026-03-08T23:44:13.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj393 testdata.475827 2026-03-08T23:44:13.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.056 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj394 testdata.475827 2026-03-08T23:44:13.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj395 testdata.475827 2026-03-08T23:44:13.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj396 testdata.475827 2026-03-08T23:44:13.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj397 testdata.475827 2026-03-08T23:44:13.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.154 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj398 testdata.475827 2026-03-08T23:44:13.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.182 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj399 testdata.475827 2026-03-08T23:44:13.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj400 testdata.475827 2026-03-08T23:44:13.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj401 testdata.475827 2026-03-08T23:44:13.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj402 testdata.475827 2026-03-08T23:44:13.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.283 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj403 testdata.475827 2026-03-08T23:44:13.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.308 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj404 testdata.475827 2026-03-08T23:44:13.331 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj405 testdata.475827 2026-03-08T23:44:13.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj406 testdata.475827 2026-03-08T23:44:13.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj407 testdata.475827 2026-03-08T23:44:13.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj408 testdata.475827 2026-03-08T23:44:13.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.431 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj409 testdata.475827 2026-03-08T23:44:13.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj410 testdata.475827 2026-03-08T23:44:13.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj411 testdata.475827 2026-03-08T23:44:13.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj412 testdata.475827 2026-03-08T23:44:13.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj413 testdata.475827 2026-03-08T23:44:13.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.555 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj414 testdata.475827 2026-03-08T23:44:13.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj415 testdata.475827 2026-03-08T23:44:13.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj416 testdata.475827 2026-03-08T23:44:13.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.627 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj417 testdata.475827 2026-03-08T23:44:13.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj418 testdata.475827 2026-03-08T23:44:13.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.676 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj419 testdata.475827 2026-03-08T23:44:13.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj420 testdata.475827 2026-03-08T23:44:13.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj421 testdata.475827 2026-03-08T23:44:13.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj422 testdata.475827 2026-03-08T23:44:13.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj423 testdata.475827 2026-03-08T23:44:13.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.953 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj424 testdata.475827 2026-03-08T23:44:13.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj425 testdata.475827 2026-03-08T23:44:13.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:13.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj426 testdata.475827 2026-03-08T23:44:14.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj427 testdata.475827 2026-03-08T23:44:14.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj428 testdata.475827 2026-03-08T23:44:14.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.068 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj429 testdata.475827 2026-03-08T23:44:14.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj430 testdata.475827 2026-03-08T23:44:14.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj431 testdata.475827 2026-03-08T23:44:14.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj432 testdata.475827 2026-03-08T23:44:14.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj433 testdata.475827 2026-03-08T23:44:14.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.188 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj434 testdata.475827 2026-03-08T23:44:14.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj435 testdata.475827 2026-03-08T23:44:14.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj436 testdata.475827 2026-03-08T23:44:14.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj437 testdata.475827 2026-03-08T23:44:14.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj438 testdata.475827 2026-03-08T23:44:14.310 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.310 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj439 testdata.475827 2026-03-08T23:44:14.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj440 testdata.475827 2026-03-08T23:44:14.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj441 testdata.475827 2026-03-08T23:44:14.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj442 testdata.475827 2026-03-08T23:44:14.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj443 testdata.475827 2026-03-08T23:44:14.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.431 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:14.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj444 testdata.475827 2026-03-08T23:44:14.480
[... the same two xtrace lines (osd-scrub-test.sh:335 `for i in `seq 1 $objects`` and :337 `rados -p test put obj<i> testdata.475827`) repeat for obj445 through obj547, timestamps 2026-03-08T23:44:14.480 through 2026-03-08T23:44:17.410 ...]
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj548 testdata.475827 2026-03-08T23:44:17.434 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.434
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj549 testdata.475827 2026-03-08T23:44:17.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj550 testdata.475827 2026-03-08T23:44:17.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj551 testdata.475827 2026-03-08T23:44:17.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj552 testdata.475827 2026-03-08T23:44:17.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj553 testdata.475827 2026-03-08T23:44:17.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.553 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj554 testdata.475827 2026-03-08T23:44:17.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj555 testdata.475827 2026-03-08T23:44:17.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj556 testdata.475827 2026-03-08T23:44:17.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj557 testdata.475827 2026-03-08T23:44:17.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj558 testdata.475827 2026-03-08T23:44:17.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.670 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj559 testdata.475827 2026-03-08T23:44:17.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj560 testdata.475827 2026-03-08T23:44:17.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj561 testdata.475827 2026-03-08T23:44:17.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj562 testdata.475827 2026-03-08T23:44:17.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.759 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj563 testdata.475827 2026-03-08T23:44:17.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.785 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj564 testdata.475827 2026-03-08T23:44:17.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.811 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj565 testdata.475827 2026-03-08T23:44:17.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj566 testdata.475827 2026-03-08T23:44:17.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj567 testdata.475827 2026-03-08T23:44:17.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj568 testdata.475827 2026-03-08T23:44:17.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.906 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj569 testdata.475827 2026-03-08T23:44:17.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj570 testdata.475827 2026-03-08T23:44:17.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj571 testdata.475827 2026-03-08T23:44:17.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj572 testdata.475827 2026-03-08T23:44:17.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:17.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj573 testdata.475827 2026-03-08T23:44:18.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.024 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj574 testdata.475827 2026-03-08T23:44:18.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj575 testdata.475827 2026-03-08T23:44:18.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj576 testdata.475827 2026-03-08T23:44:18.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj577 testdata.475827 2026-03-08T23:44:18.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj578 testdata.475827 2026-03-08T23:44:18.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.148 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj579 testdata.475827 2026-03-08T23:44:18.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj580 testdata.475827 2026-03-08T23:44:18.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj581 testdata.475827 2026-03-08T23:44:18.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj582 testdata.475827 2026-03-08T23:44:18.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj583 testdata.475827 2026-03-08T23:44:18.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.263 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj584 testdata.475827 2026-03-08T23:44:18.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj585 testdata.475827 2026-03-08T23:44:18.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj586 testdata.475827 2026-03-08T23:44:18.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.332 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj587 testdata.475827 2026-03-08T23:44:18.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.355 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj588 testdata.475827 2026-03-08T23:44:18.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.377 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj589 testdata.475827 2026-03-08T23:44:18.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj590 testdata.475827 2026-03-08T23:44:18.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj591 testdata.475827 2026-03-08T23:44:18.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj592 testdata.475827 2026-03-08T23:44:18.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj593 testdata.475827 2026-03-08T23:44:18.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.494 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj594 testdata.475827 2026-03-08T23:44:18.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj595 testdata.475827 2026-03-08T23:44:18.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj596 testdata.475827 2026-03-08T23:44:18.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj597 testdata.475827 2026-03-08T23:44:18.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj598 testdata.475827 2026-03-08T23:44:18.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.613 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj599 testdata.475827 2026-03-08T23:44:18.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj600 testdata.475827 2026-03-08T23:44:18.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj601 testdata.475827 2026-03-08T23:44:18.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj602 testdata.475827 2026-03-08T23:44:18.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj603 testdata.475827 2026-03-08T23:44:18.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.731 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj604 testdata.475827 2026-03-08T23:44:18.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj605 testdata.475827 2026-03-08T23:44:18.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj606 testdata.475827 2026-03-08T23:44:18.801 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj607 testdata.475827 2026-03-08T23:44:18.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj608 testdata.475827 2026-03-08T23:44:18.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.847 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj609 testdata.475827 2026-03-08T23:44:18.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj610 testdata.475827 2026-03-08T23:44:18.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj611 testdata.475827 2026-03-08T23:44:18.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj612 testdata.475827 2026-03-08T23:44:18.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:18.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj613 testdata.475827 2026-03-08T23:44:19.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:19.011 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj614 testdata.475827
2026-03-08T23:44:19.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj615 testdata.475827
2026-03-08T23:44:19.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj616 testdata.475827
2026-03-08T23:44:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj617 testdata.475827
2026-03-08T23:44:19.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj618 testdata.475827
2026-03-08T23:44:19.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.119 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj619 testdata.475827
2026-03-08T23:44:19.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj620 testdata.475827
2026-03-08T23:44:19.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj621 testdata.475827
2026-03-08T23:44:19.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj622 testdata.475827
2026-03-08T23:44:19.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.210 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj623 testdata.475827
2026-03-08T23:44:19.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj624 testdata.475827
2026-03-08T23:44:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj625 testdata.475827
2026-03-08T23:44:19.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj626 testdata.475827
2026-03-08T23:44:19.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj627 testdata.475827
2026-03-08T23:44:19.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj628 testdata.475827
2026-03-08T23:44:19.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj629 testdata.475827
2026-03-08T23:44:19.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj630 testdata.475827
2026-03-08T23:44:19.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj631 testdata.475827
2026-03-08T23:44:19.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj632 testdata.475827
2026-03-08T23:44:19.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj633 testdata.475827
2026-03-08T23:44:19.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj634 testdata.475827
2026-03-08T23:44:19.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.470 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj635 testdata.475827
2026-03-08T23:44:19.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj636 testdata.475827
2026-03-08T23:44:19.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj637 testdata.475827
2026-03-08T23:44:19.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj638 testdata.475827
2026-03-08T23:44:19.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj639 testdata.475827
2026-03-08T23:44:19.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj640 testdata.475827
2026-03-08T23:44:19.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj641 testdata.475827
2026-03-08T23:44:19.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj642 testdata.475827
2026-03-08T23:44:19.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj643 testdata.475827
2026-03-08T23:44:19.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj644 testdata.475827
2026-03-08T23:44:19.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj645 testdata.475827
2026-03-08T23:44:19.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj646 testdata.475827
2026-03-08T23:44:19.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj647 testdata.475827
2026-03-08T23:44:19.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj648 testdata.475827
2026-03-08T23:44:19.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.804 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj649 testdata.475827
2026-03-08T23:44:19.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj650 testdata.475827
2026-03-08T23:44:19.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj651 testdata.475827
2026-03-08T23:44:19.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj652 testdata.475827
2026-03-08T23:44:19.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj653 testdata.475827
2026-03-08T23:44:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj654 testdata.475827
2026-03-08T23:44:19.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj655 testdata.475827
2026-03-08T23:44:19.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj656 testdata.475827
2026-03-08T23:44:19.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:19.993 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj657 testdata.475827
2026-03-08T23:44:20.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj658 testdata.475827
2026-03-08T23:44:20.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj659 testdata.475827
2026-03-08T23:44:20.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj660 testdata.475827
2026-03-08T23:44:20.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj661 testdata.475827
2026-03-08T23:44:20.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj662 testdata.475827
2026-03-08T23:44:20.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj663 testdata.475827
2026-03-08T23:44:20.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj664 testdata.475827
2026-03-08T23:44:20.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj665 testdata.475827
2026-03-08T23:44:20.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj666 testdata.475827
2026-03-08T23:44:20.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj667 testdata.475827
2026-03-08T23:44:20.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj668 testdata.475827
2026-03-08T23:44:20.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.282 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj669 testdata.475827
2026-03-08T23:44:20.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.306 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj670 testdata.475827
2026-03-08T23:44:20.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj671 testdata.475827
2026-03-08T23:44:20.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj672 testdata.475827
2026-03-08T23:44:20.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj673 testdata.475827
2026-03-08T23:44:20.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.401 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj674 testdata.475827
2026-03-08T23:44:20.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj675 testdata.475827
2026-03-08T23:44:20.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj676 testdata.475827
2026-03-08T23:44:20.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj677 testdata.475827
2026-03-08T23:44:20.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj678 testdata.475827
2026-03-08T23:44:20.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj679 testdata.475827
2026-03-08T23:44:20.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj680 testdata.475827
2026-03-08T23:44:20.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj681 testdata.475827
2026-03-08T23:44:20.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj682 testdata.475827
2026-03-08T23:44:20.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj683 testdata.475827
2026-03-08T23:44:20.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj684 testdata.475827
2026-03-08T23:44:20.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.659 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj685 testdata.475827
2026-03-08T23:44:20.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj686 testdata.475827
2026-03-08T23:44:20.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj687 testdata.475827
2026-03-08T23:44:20.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj688 testdata.475827
2026-03-08T23:44:20.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj689 testdata.475827
2026-03-08T23:44:20.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj690 testdata.475827
2026-03-08T23:44:20.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj691 testdata.475827
2026-03-08T23:44:20.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.812 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj692 testdata.475827
2026-03-08T23:44:20.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj693 testdata.475827
2026-03-08T23:44:20.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj694 testdata.475827
2026-03-08T23:44:20.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj695 testdata.475827
2026-03-08T23:44:20.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj696 testdata.475827
2026-03-08T23:44:20.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj697 testdata.475827
2026-03-08T23:44:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj698 testdata.475827
2026-03-08T23:44:20.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj699 testdata.475827
2026-03-08T23:44:20.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:20.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj700 testdata.475827
2026-03-08T23:44:21.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj701 testdata.475827
2026-03-08T23:44:21.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.044 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj702 testdata.475827
2026-03-08T23:44:21.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj703 testdata.475827
2026-03-08T23:44:21.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj704 testdata.475827
2026-03-08T23:44:21.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj705 testdata.475827
2026-03-08T23:44:21.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj706 testdata.475827
2026-03-08T23:44:21.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj707 testdata.475827
2026-03-08T23:44:21.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.184 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj708 testdata.475827
2026-03-08T23:44:21.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.211 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj709 testdata.475827
2026-03-08T23:44:21.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj710 testdata.475827
2026-03-08T23:44:21.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj711 testdata.475827
2026-03-08T23:44:21.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj712 testdata.475827
2026-03-08T23:44:21.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj713 testdata.475827
2026-03-08T23:44:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj714 testdata.475827
2026-03-08T23:44:21.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.353 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj715 testdata.475827
2026-03-08T23:44:21.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj716 testdata.475827
2026-03-08T23:44:21.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj717 testdata.475827
2026-03-08T23:44:21.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj718 testdata.475827
2026-03-08T23:44:21.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:21.442
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj719 testdata.475827 2026-03-08T23:44:21.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj720 testdata.475827 2026-03-08T23:44:21.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj721 testdata.475827 2026-03-08T23:44:21.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj722 testdata.475827 2026-03-08T23:44:21.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj723 testdata.475827 2026-03-08T23:44:21.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.558 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj724 testdata.475827 2026-03-08T23:44:21.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj725 testdata.475827 2026-03-08T23:44:21.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj726 testdata.475827 2026-03-08T23:44:21.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.631 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj727 testdata.475827 2026-03-08T23:44:21.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj728 testdata.475827 2026-03-08T23:44:21.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.682 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj729 testdata.475827 2026-03-08T23:44:21.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.709 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj730 testdata.475827 2026-03-08T23:44:21.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj731 testdata.475827 2026-03-08T23:44:21.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj732 testdata.475827 2026-03-08T23:44:21.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj733 testdata.475827 2026-03-08T23:44:21.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.809 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj734 testdata.475827 2026-03-08T23:44:21.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj735 testdata.475827 2026-03-08T23:44:21.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj736 testdata.475827 2026-03-08T23:44:21.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj737 testdata.475827 2026-03-08T23:44:21.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj738 testdata.475827 2026-03-08T23:44:21.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.932 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj739 testdata.475827 2026-03-08T23:44:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj740 testdata.475827 2026-03-08T23:44:21.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:21.984 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj741 testdata.475827 2026-03-08T23:44:22.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj742 testdata.475827 2026-03-08T23:44:22.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj743 testdata.475827 2026-03-08T23:44:22.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.048 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj744 testdata.475827 2026-03-08T23:44:22.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj745 testdata.475827 2026-03-08T23:44:22.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj746 testdata.475827 2026-03-08T23:44:22.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj747 testdata.475827 2026-03-08T23:44:22.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj748 testdata.475827 2026-03-08T23:44:22.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.158 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj749 testdata.475827 2026-03-08T23:44:22.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj750 testdata.475827 2026-03-08T23:44:22.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj751 testdata.475827 2026-03-08T23:44:22.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj752 testdata.475827 2026-03-08T23:44:22.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj753 testdata.475827 2026-03-08T23:44:22.272 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.273 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj754 testdata.475827 2026-03-08T23:44:22.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj755 testdata.475827 2026-03-08T23:44:22.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj756 testdata.475827 2026-03-08T23:44:22.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj757 testdata.475827 2026-03-08T23:44:22.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj758 testdata.475827 2026-03-08T23:44:22.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.390 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj759 testdata.475827 2026-03-08T23:44:22.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj760 testdata.475827 2026-03-08T23:44:22.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj761 testdata.475827 2026-03-08T23:44:22.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj762 testdata.475827 2026-03-08T23:44:22.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj763 testdata.475827 2026-03-08T23:44:22.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.500 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj764 testdata.475827 2026-03-08T23:44:22.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.523 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj765 testdata.475827 2026-03-08T23:44:22.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj766 testdata.475827 2026-03-08T23:44:22.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj767 testdata.475827 2026-03-08T23:44:22.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj768 testdata.475827 2026-03-08T23:44:22.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.617 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj769 testdata.475827 2026-03-08T23:44:22.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj770 testdata.475827 2026-03-08T23:44:22.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj771 testdata.475827 2026-03-08T23:44:22.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj772 testdata.475827 2026-03-08T23:44:22.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj773 testdata.475827 2026-03-08T23:44:22.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.738 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj774 testdata.475827 2026-03-08T23:44:22.760 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.761 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj775 testdata.475827 2026-03-08T23:44:22.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj776 testdata.475827 2026-03-08T23:44:22.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj777 testdata.475827 2026-03-08T23:44:22.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj778 testdata.475827 2026-03-08T23:44:22.847 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.847 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj779 testdata.475827 2026-03-08T23:44:22.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj780 testdata.475827 2026-03-08T23:44:22.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj781 testdata.475827 2026-03-08T23:44:22.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj782 testdata.475827 2026-03-08T23:44:22.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj783 testdata.475827 2026-03-08T23:44:22.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:22.957 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj784 testdata.475827
2026-03-08T23:44:22.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:22.983 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj785 testdata.475827
2026-03-08T23:44:23.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj786 testdata.475827
2026-03-08T23:44:23.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj787 testdata.475827
2026-03-08T23:44:23.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj788 testdata.475827
2026-03-08T23:44:23.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj789 testdata.475827
2026-03-08T23:44:23.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj790 testdata.475827
2026-03-08T23:44:23.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj791 testdata.475827
2026-03-08T23:44:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj792 testdata.475827
2026-03-08T23:44:23.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj793 testdata.475827
2026-03-08T23:44:23.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj794 testdata.475827
2026-03-08T23:44:23.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj795 testdata.475827
2026-03-08T23:44:23.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj796 testdata.475827
2026-03-08T23:44:23.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj797 testdata.475827
2026-03-08T23:44:23.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.297 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj798 testdata.475827
2026-03-08T23:44:23.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj799 testdata.475827
2026-03-08T23:44:23.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj800 testdata.475827
2026-03-08T23:44:23.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj801 testdata.475827
2026-03-08T23:44:23.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj802 testdata.475827
2026-03-08T23:44:23.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.418 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj803 testdata.475827
2026-03-08T23:44:23.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj804 testdata.475827
2026-03-08T23:44:23.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.464 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj805 testdata.475827
2026-03-08T23:44:23.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj806 testdata.475827
2026-03-08T23:44:23.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj807 testdata.475827
2026-03-08T23:44:23.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj808 testdata.475827
2026-03-08T23:44:23.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj809 testdata.475827
2026-03-08T23:44:23.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj810 testdata.475827
2026-03-08T23:44:23.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj811 testdata.475827
2026-03-08T23:44:23.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj812 testdata.475827
2026-03-08T23:44:23.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj813 testdata.475827
2026-03-08T23:44:23.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj814 testdata.475827
2026-03-08T23:44:23.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj815 testdata.475827
2026-03-08T23:44:23.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.726 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj816 testdata.475827
2026-03-08T23:44:23.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj817 testdata.475827
2026-03-08T23:44:23.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj818 testdata.475827
2026-03-08T23:44:23.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj819 testdata.475827
2026-03-08T23:44:23.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj820 testdata.475827
2026-03-08T23:44:23.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj821 testdata.475827
2026-03-08T23:44:23.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj822 testdata.475827
2026-03-08T23:44:23.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj823 testdata.475827
2026-03-08T23:44:23.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj824 testdata.475827
2026-03-08T23:44:23.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj825 testdata.475827
2026-03-08T23:44:23.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:23.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj826 testdata.475827
2026-03-08T23:44:24.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj827 testdata.475827
2026-03-08T23:44:24.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj828 testdata.475827
2026-03-08T23:44:24.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj829 testdata.475827
2026-03-08T23:44:24.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj830 testdata.475827
2026-03-08T23:44:24.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj831 testdata.475827
2026-03-08T23:44:24.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj832 testdata.475827
2026-03-08T23:44:24.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj833 testdata.475827
2026-03-08T23:44:24.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj834 testdata.475827
2026-03-08T23:44:24.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.212 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj835 testdata.475827
2026-03-08T23:44:24.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj836 testdata.475827
2026-03-08T23:44:24.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj837 testdata.475827
2026-03-08T23:44:24.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj838 testdata.475827
2026-03-08T23:44:24.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj839 testdata.475827
2026-03-08T23:44:24.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj840 testdata.475827
2026-03-08T23:44:24.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj841 testdata.475827
2026-03-08T23:44:24.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj842 testdata.475827
2026-03-08T23:44:24.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj843 testdata.475827
2026-03-08T23:44:24.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.442 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj844 testdata.475827
2026-03-08T23:44:24.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj845 testdata.475827
2026-03-08T23:44:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj846 testdata.475827
2026-03-08T23:44:24.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj847 testdata.475827
2026-03-08T23:44:24.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj848 testdata.475827
2026-03-08T23:44:24.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj849 testdata.475827
2026-03-08T23:44:24.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj850 testdata.475827
2026-03-08T23:44:24.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj851 testdata.475827
2026-03-08T23:44:24.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj852 testdata.475827
2026-03-08T23:44:24.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj853 testdata.475827
2026-03-08T23:44:24.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj854 testdata.475827
2026-03-08T23:44:24.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj855 testdata.475827
2026-03-08T23:44:24.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj856 testdata.475827
2026-03-08T23:44:24.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj857 testdata.475827
2026-03-08T23:44:24.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.810 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj858 testdata.475827
2026-03-08T23:44:24.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj859 testdata.475827
2026-03-08T23:44:24.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.863 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj860 testdata.475827
2026-03-08T23:44:24.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj861 testdata.475827
2026-03-08T23:44:24.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj862 testdata.475827
2026-03-08T23:44:24.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj863 testdata.475827
2026-03-08T23:44:24.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj864 testdata.475827
2026-03-08T23:44:24.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj865 testdata.475827
2026-03-08T23:44:24.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:24.996 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj866 testdata.475827
2026-03-08T23:44:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj867 testdata.475827
2026-03-08T23:44:25.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj868 testdata.475827
2026-03-08T23:44:25.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj869 testdata.475827
2026-03-08T23:44:25.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj870 testdata.475827
2026-03-08T23:44:25.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.113 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj871 testdata.475827
2026-03-08T23:44:25.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj872 testdata.475827
2026-03-08T23:44:25.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj873 testdata.475827
2026-03-08T23:44:25.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj874 testdata.475827
2026-03-08T23:44:25.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj875 testdata.475827
2026-03-08T23:44:25.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj876 testdata.475827
2026-03-08T23:44:25.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj877 testdata.475827
2026-03-08T23:44:25.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.277 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj878 testdata.475827
2026-03-08T23:44:25.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj879 testdata.475827
2026-03-08T23:44:25.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj880 testdata.475827
2026-03-08T23:44:25.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj881 testdata.475827
2026-03-08T23:44:25.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj882 testdata.475827
2026-03-08T23:44:25.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj883 testdata.475827
2026-03-08T23:44:25.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj884 testdata.475827
2026-03-08T23:44:25.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj885 testdata.475827
2026-03-08T23:44:25.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj886 testdata.475827
2026-03-08T23:44:25.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj887 testdata.475827
2026-03-08T23:44:25.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:25.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj888 testdata.475827
2026-03-08T23:44:26.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects`
2026-03-08T23:44:26.153
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj889 testdata.475827 2026-03-08T23:44:26.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj890 testdata.475827 2026-03-08T23:44:26.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj891 testdata.475827 2026-03-08T23:44:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.214 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj892 testdata.475827 2026-03-08T23:44:26.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj893 testdata.475827 2026-03-08T23:44:26.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.254 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj894 testdata.475827 2026-03-08T23:44:26.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj895 testdata.475827 2026-03-08T23:44:26.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.293 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj896 testdata.475827 2026-03-08T23:44:26.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj897 testdata.475827 2026-03-08T23:44:26.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj898 testdata.475827 2026-03-08T23:44:26.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.360 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj899 testdata.475827 2026-03-08T23:44:26.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj900 testdata.475827 2026-03-08T23:44:26.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj901 testdata.475827 2026-03-08T23:44:26.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj902 testdata.475827 2026-03-08T23:44:26.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj903 testdata.475827 2026-03-08T23:44:26.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.462 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj904 testdata.475827 2026-03-08T23:44:26.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj905 testdata.475827 2026-03-08T23:44:26.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj906 testdata.475827 2026-03-08T23:44:26.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.522 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj907 testdata.475827 2026-03-08T23:44:26.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj908 testdata.475827 2026-03-08T23:44:26.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.572 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj909 testdata.475827 2026-03-08T23:44:26.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj910 testdata.475827 2026-03-08T23:44:26.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj911 testdata.475827 2026-03-08T23:44:26.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj912 testdata.475827 2026-03-08T23:44:26.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj913 testdata.475827 2026-03-08T23:44:26.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.689 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj914 testdata.475827 2026-03-08T23:44:26.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj915 testdata.475827 2026-03-08T23:44:26.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:26.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj916 testdata.475827 2026-03-08T23:44:27.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.049 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj917 testdata.475827 2026-03-08T23:44:27.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj918 testdata.475827 2026-03-08T23:44:27.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.092 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj919 testdata.475827 2026-03-08T23:44:27.115 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.116 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj920 testdata.475827 2026-03-08T23:44:27.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj921 testdata.475827 2026-03-08T23:44:27.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj922 testdata.475827 2026-03-08T23:44:27.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj923 testdata.475827 2026-03-08T23:44:27.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.202 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj924 testdata.475827 2026-03-08T23:44:27.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj925 testdata.475827 2026-03-08T23:44:27.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj926 testdata.475827 2026-03-08T23:44:27.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj927 testdata.475827 2026-03-08T23:44:27.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj928 testdata.475827 2026-03-08T23:44:27.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.313 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj929 testdata.475827 2026-03-08T23:44:27.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj930 testdata.475827 2026-03-08T23:44:27.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj931 testdata.475827 2026-03-08T23:44:27.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj932 testdata.475827 2026-03-08T23:44:27.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj933 testdata.475827 2026-03-08T23:44:27.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.437 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj934 testdata.475827 2026-03-08T23:44:27.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj935 testdata.475827 2026-03-08T23:44:27.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj936 testdata.475827 2026-03-08T23:44:27.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj937 testdata.475827 2026-03-08T23:44:27.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj938 testdata.475827 2026-03-08T23:44:27.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.557 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj939 testdata.475827 2026-03-08T23:44:27.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj940 testdata.475827 2026-03-08T23:44:27.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj941 testdata.475827 2026-03-08T23:44:27.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj942 testdata.475827 2026-03-08T23:44:27.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj943 testdata.475827 2026-03-08T23:44:27.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.673 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj944 testdata.475827 2026-03-08T23:44:27.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.697 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj945 testdata.475827 2026-03-08T23:44:27.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj946 testdata.475827 2026-03-08T23:44:27.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj947 testdata.475827 2026-03-08T23:44:27.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj948 testdata.475827 2026-03-08T23:44:27.794 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.795 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj949 testdata.475827 2026-03-08T23:44:27.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj950 testdata.475827 2026-03-08T23:44:27.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj951 testdata.475827 2026-03-08T23:44:27.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj952 testdata.475827 2026-03-08T23:44:27.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj953 testdata.475827 2026-03-08T23:44:27.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.910 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj954 testdata.475827 2026-03-08T23:44:27.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj955 testdata.475827 2026-03-08T23:44:27.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj956 testdata.475827 2026-03-08T23:44:27.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:27.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj957 testdata.475827 2026-03-08T23:44:28.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj958 testdata.475827 2026-03-08T23:44:28.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.028 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj959 testdata.475827 2026-03-08T23:44:28.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj960 testdata.475827 2026-03-08T23:44:28.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj961 testdata.475827 2026-03-08T23:44:28.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.097 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj962 testdata.475827 2026-03-08T23:44:28.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.121 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj963 testdata.475827 2026-03-08T23:44:28.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.146 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj964 testdata.475827 2026-03-08T23:44:28.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj965 testdata.475827 2026-03-08T23:44:28.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj966 testdata.475827 2026-03-08T23:44:28.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj967 testdata.475827 2026-03-08T23:44:28.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj968 testdata.475827 2026-03-08T23:44:28.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.509 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj969 testdata.475827 2026-03-08T23:44:28.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj970 testdata.475827 2026-03-08T23:44:28.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.696 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj971 testdata.475827 2026-03-08T23:44:28.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj972 testdata.475827 2026-03-08T23:44:28.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj973 testdata.475827 2026-03-08T23:44:28.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.756 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj974 testdata.475827 2026-03-08T23:44:28.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj975 testdata.475827 2026-03-08T23:44:28.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:28.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj976 testdata.475827 2026-03-08T23:44:29.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj977 testdata.475827 2026-03-08T23:44:29.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj978 testdata.475827 2026-03-08T23:44:29.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.180 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj979 testdata.475827 2026-03-08T23:44:29.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj980 testdata.475827 2026-03-08T23:44:29.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj981 testdata.475827 2026-03-08T23:44:29.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj982 testdata.475827 2026-03-08T23:44:29.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj983 testdata.475827 2026-03-08T23:44:29.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.304 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj984 testdata.475827 2026-03-08T23:44:29.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj985 testdata.475827 2026-03-08T23:44:29.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj986 testdata.475827 2026-03-08T23:44:29.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj987 testdata.475827 2026-03-08T23:44:29.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj988 testdata.475827 2026-03-08T23:44:29.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.419 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj989 testdata.475827 2026-03-08T23:44:29.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj990 testdata.475827 2026-03-08T23:44:29.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj991 testdata.475827 2026-03-08T23:44:29.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj992 testdata.475827 2026-03-08T23:44:29.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj993 testdata.475827 2026-03-08T23:44:29.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.534 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj994 testdata.475827 2026-03-08T23:44:29.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj995 testdata.475827 2026-03-08T23:44:29.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj996 testdata.475827 2026-03-08T23:44:29.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj997 testdata.475827 2026-03-08T23:44:29.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj998 testdata.475827 2026-03-08T23:44:29.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.647 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj999 testdata.475827 2026-03-08T23:44:29.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:335: _scrub_abort: for i in `seq 1 $objects` 2026-03-08T23:44:29.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:337: _scrub_abort: rados -p test put obj1000 testdata.475827 2026-03-08T23:44:29.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:339: _scrub_abort: rm -f testdata.475827 2026-03-08T23:44:29.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:341: _scrub_abort: get_primary test obj1 2026-03-08T23:44:29.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:44:29.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:44:29.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:44:29.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:44:29.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:341: _scrub_abort: local primary=1 2026-03-08T23:44:29.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:342: _scrub_abort: local pgid=1.0 
2026-03-08T23:44:29.869 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:344: _scrub_abort: ceph tell 1.0 schedule-scrub 2026-03-08T23:44:29.941 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:44:29.941 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:44:29.941 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:44:29.941 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:42:49.952081+0000" 2026-03-08T23:44:29.941 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:44:29.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:347: _scrub_abort: set -o pipefail 2026-03-08T23:44:29.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:348: _scrub_abort: found=no 2026-03-08T23:44:29.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:349: _scrub_abort: seq 0 200 2026-03-08T23:44:29.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:349: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:29.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:351: _scrub_abort: flush_pg_stats 2026-03-08T23:44:29.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:29.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:30.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:30.130 
INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:30.130 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:30.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:30.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:30.130 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:30.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836524 2026-03-08T23:44:30.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836524 2026-03-08T23:44:30.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524' 2026-03-08T23:44:30.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:30.215 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:44:30.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672999 2026-03-08T23:44:30.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672999 2026-03-08T23:44:30.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836524 1-42949672999' 2026-03-08T23:44:30.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:30.300 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:44:30.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542177 2026-03-08T23:44:30.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542177 2026-03-08T23:44:30.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836524 1-42949672999 2-60129542177' 2026-03-08T23:44:30.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:30.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836524 2026-03-08T23:44:30.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:30.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:44:30.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836524 2026-03-08T23:44:30.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:30.383 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836524 2026-03-08T23:44:30.383 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836524 2026-03-08T23:44:30.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836524' 2026-03-08T23:44:30.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:30.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836522 -lt 21474836524 2026-03-08T23:44:30.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:31.569 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:44:31.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:31.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836522 -lt 21474836524 2026-03-08T23:44:31.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:32.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:44:32.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:44:32.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836525 -lt 21474836524 2026-03-08T23:44:32.928 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:32.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672999 2026-03-08T23:44:32.928 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:32.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:44:32.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672999 2026-03-08T23:44:32.929 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:32.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672999 2026-03-08T23:44:32.930 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672999 2026-03-08T23:44:32.931 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672999' 2026-03-08T23:44:32.931 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:44:33.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 42949673001 -lt 42949672999 2026-03-08T23:44:33.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:33.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542177 2026-03-08T23:44:33.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:33.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:44:33.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542177 2026-03-08T23:44:33.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:33.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542177 2026-03-08T23:44:33.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542177' 2026-03-08T23:44:33.233 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542177 2026-03-08T23:44:33.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:44:33.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542178 -lt 60129542177 2026-03-08T23:44:33.408 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:352: _scrub_abort: ceph pg dump pgs 2026-03-08T23:44:33.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:352: _scrub_abort: grep -q scrubbing 2026-03-08T23:44:33.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:352: _scrub_abort: grep '^1.0' 2026-03-08T23:44:33.572 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:44:33.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:354: _scrub_abort: found=yes 2026-03-08T23:44:33.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:356: _scrub_abort: break 2026-03-08T23:44:33.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:359: _scrub_abort: set +o pipefail 2026-03-08T23:44:33.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:361: _scrub_abort: test yes = no 2026-03-08T23:44:33.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:367: _scrub_abort: ceph osd set noscrub 2026-03-08T23:44:33.820 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is set 2026-03-08T23:44:33.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:368: _scrub_abort: '[' scrub = deep-scrub ']' 2026-03-08T23:44:33.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:374: _scrub_abort: set -o pipefail 2026-03-08T23:44:33.837 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: seq 0 200 2026-03-08T23:44:33.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:33.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:44:33.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:33.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:34.012 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:34.012 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:34.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:34.012 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:34.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836529 2026-03-08T23:44:34.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836529 
2026-03-08T23:44:34.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529'
2026-03-08T23:44:34.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:34.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:44:34.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673004
2026-03-08T23:44:34.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673004
2026-03-08T23:44:34.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673004'
2026-03-08T23:44:34.180 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:34.180 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:44:34.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542182
2026-03-08T23:44:34.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542182
2026-03-08T23:44:34.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673004 2-60129542182'
2026-03-08T23:44:34.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:34.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836529
2026-03-08T23:44:34.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:34.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:44:34.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836529
2026-03-08T23:44:34.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:34.264 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836529
2026-03-08T23:44:34.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836529
2026-03-08T23:44:34.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836529'
2026-03-08T23:44:34.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:34.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836527 -lt 21474836529
2026-03-08T23:44:34.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:44:35.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:44:35.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:35.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836527 -lt 21474836529
2026-03-08T23:44:35.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:44:36.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']'
2026-03-08T23:44:36.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:36.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836530 -lt 21474836529
2026-03-08T23:44:36.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:36.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673004
2026-03-08T23:44:36.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:36.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:44:36.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673004
2026-03-08T23:44:36.790 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:36.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673004
2026-03-08T23:44:36.791 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673004
2026-03-08T23:44:36.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673004'
2026-03-08T23:44:36.791 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:44:36.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673006 -lt 42949673004
2026-03-08T23:44:36.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:36.962 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542182
2026-03-08T23:44:36.962 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:36.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:44:36.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542182
2026-03-08T23:44:36.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:36.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542182
2026-03-08T23:44:36.964 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542182
2026-03-08T23:44:36.964 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542182'
2026-03-08T23:44:36.964 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:44:37.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542183 -lt 60129542182
2026-03-08T23:44:37.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:44:37.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:44:37.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:44:37.296 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:44:37.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:44:37.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:44:37.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:44:37.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:44:37.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:44:37.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:44:37.486 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:44:37.486 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:44:37.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:44:37.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:37.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:44:37.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836533
2026-03-08T23:44:37.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836533
2026-03-08T23:44:37.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836533'
2026-03-08T23:44:37.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:37.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:44:37.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673008
2026-03-08T23:44:37.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673008
2026-03-08T23:44:37.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836533 1-42949673008'
2026-03-08T23:44:37.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:37.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:44:37.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542186
2026-03-08T23:44:37.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542186
2026-03-08T23:44:37.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836533 1-42949673008 2-60129542186'
2026-03-08T23:44:37.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:37.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836533
2026-03-08T23:44:37.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:37.721 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:44:37.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836533
2026-03-08T23:44:37.721 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:37.722 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836533
2026-03-08T23:44:37.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836533
2026-03-08T23:44:37.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836533'
2026-03-08T23:44:37.722 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:37.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836533 -lt 21474836533
2026-03-08T23:44:37.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:37.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:37.894 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673008
2026-03-08T23:44:37.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:44:37.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673008
2026-03-08T23:44:37.895 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:37.896 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673008
2026-03-08T23:44:37.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673008
2026-03-08T23:44:37.896 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673008'
2026-03-08T23:44:37.896 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:44:38.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673009 -lt 42949673008
2026-03-08T23:44:38.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:38.081 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542186
2026-03-08T23:44:38.082 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:38.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:44:38.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542186
2026-03-08T23:44:38.084 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:38.085 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542186
2026-03-08T23:44:38.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542186
2026-03-08T23:44:38.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542186'
2026-03-08T23:44:38.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:44:38.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542186 -lt 60129542186
2026-03-08T23:44:38.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:44:38.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:44:38.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:44:38.449 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:44:38.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:44:38.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:44:38.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:44:38.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:44:38.465 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:44:38.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:44:38.643 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:44:38.643 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:44:38.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:44:38.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:38.643 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:44:38.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836535
2026-03-08T23:44:38.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836535
2026-03-08T23:44:38.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836535'
2026-03-08T23:44:38.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:38.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:44:38.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673011
2026-03-08T23:44:38.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673011
2026-03-08T23:44:38.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836535 1-42949673011'
2026-03-08T23:44:38.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:38.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:44:38.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542188
2026-03-08T23:44:38.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542188
2026-03-08T23:44:38.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836535 1-42949673011 2-60129542188'
2026-03-08T23:44:38.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:38.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836535
2026-03-08T23:44:38.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:38.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:44:38.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836535
2026-03-08T23:44:38.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:38.948 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836535
2026-03-08T23:44:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836535
2026-03-08T23:44:38.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836535'
2026-03-08T23:44:38.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:39.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836533 -lt 21474836535
2026-03-08T23:44:39.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:44:40.127 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:44:40.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:40.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836536 -lt 21474836535
2026-03-08T23:44:40.291 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:40.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673011
2026-03-08T23:44:40.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:40.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:44:40.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673011
2026-03-08T23:44:40.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:40.294 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673011
2026-03-08T23:44:40.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673011
2026-03-08T23:44:40.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673011'
2026-03-08T23:44:40.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:44:40.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673012 -lt 42949673011
2026-03-08T23:44:40.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:40.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542188
2026-03-08T23:44:40.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:40.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:44:40.476 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542188
2026-03-08T23:44:40.476 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:40.477 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542188
2026-03-08T23:44:40.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542188
2026-03-08T23:44:40.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542188'
2026-03-08T23:44:40.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:44:40.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542189 -lt 60129542188
2026-03-08T23:44:40.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:44:40.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:44:40.656 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:44:40.812 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:44:40.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:44:40.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:44:40.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:44:40.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:44:40.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:44:40.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:44:40.998 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:44:40.998 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:44:40.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:44:40.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:40.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:44:41.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836539
2026-03-08T23:44:41.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836539
2026-03-08T23:44:41.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836539'
2026-03-08T23:44:41.078 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:41.078 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:44:41.157 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673014
2026-03-08T23:44:41.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673014
2026-03-08T23:44:41.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836539 1-42949673014'
2026-03-08T23:44:41.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:44:41.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:44:41.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542192
2026-03-08T23:44:41.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542192
2026-03-08T23:44:41.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836539 1-42949673014 2-60129542192'
2026-03-08T23:44:41.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:41.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836539
2026-03-08T23:44:41.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:41.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:44:41.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836539
2026-03-08T23:44:41.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:41.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836539
2026-03-08T23:44:41.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836539'
2026-03-08T23:44:41.244 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836539
2026-03-08T23:44:41.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:41.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836536 -lt 21474836539
2026-03-08T23:44:41.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:44:42.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:44:42.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:44:42.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836539 -lt 21474836539
2026-03-08T23:44:42.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:42.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673014
2026-03-08T23:44:42.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:42.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:44:42.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673014
2026-03-08T23:44:42.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:44:42.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673014
2026-03-08T23:44:42.607 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673014
2026-03-08T23:44:42.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673014'
2026-03-08T23:44:42.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:44:42.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673015 -lt 42949673014
2026-03-08T23:44:42.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:44:42.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542192
2026-03-08T23:44:42.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:44:42.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274:
flush_pg_stats: osd=2 2026-03-08T23:44:42.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:42.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542192 2026-03-08T23:44:42.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542192 2026-03-08T23:44:42.785 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542192 2026-03-08T23:44:42.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542192' 2026-03-08T23:44:42.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:44:42.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542192 -lt 60129542192 2026-03-08T23:44:42.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:44:42.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:44:42.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:44:43.118 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:44:43.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:44:43.135 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:43.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:44:43.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:43.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:43.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:43.314 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:43.314 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:43.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:43.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:43.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:43.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836542 2026-03-08T23:44:43.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836542 2026-03-08T23:44:43.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836542' 2026-03-08T23:44:43.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:43.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:44:43.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673017 2026-03-08T23:44:43.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673017 2026-03-08T23:44:43.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836542 1-42949673017' 2026-03-08T23:44:43.485 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:43.485 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:44:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542195 2026-03-08T23:44:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542195 2026-03-08T23:44:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836542 1-42949673017 2-60129542195' 2026-03-08T23:44:43.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:44:43.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836542 2026-03-08T23:44:43.567 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:43.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:44:43.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836542 2026-03-08T23:44:43.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:43.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836542 2026-03-08T23:44:43.570 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836542 2026-03-08T23:44:43.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836542' 2026-03-08T23:44:43.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:43.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836539 -lt 21474836542 2026-03-08T23:44:43.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:44.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 
']' 2026-03-08T23:44:44.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:44.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836542 -lt 21474836542 2026-03-08T23:44:44.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:44.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673017 2026-03-08T23:44:44.918 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:44.919 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:44:44.919 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673017 2026-03-08T23:44:44.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:44.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673017 2026-03-08T23:44:44.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673017' 2026-03-08T23:44:44.920 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673017 2026-03-08T23:44:44.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:44:45.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673018 -lt 42949673017 2026-03-08T23:44:45.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:45.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542195 2026-03-08T23:44:45.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:45.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:44:45.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:45.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542195 2026-03-08T23:44:45.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542195 2026-03-08T23:44:45.259 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542195 2026-03-08T23:44:45.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542195' 2026-03-08T23:44:45.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:44:45.465 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542195 -lt 60129542195 2026-03-08T23:44:45.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:44:45.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:44:45.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:44:45.745 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:44:45.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:44:45.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:45.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:44:45.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:45.759 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:45.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:45.937 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:45.937 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:45.937 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:45.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:45.937 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:46.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836546 2026-03-08T23:44:46.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836546 2026-03-08T23:44:46.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836546' 2026-03-08T23:44:46.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:46.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:44:46.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673021 2026-03-08T23:44:46.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673021 2026-03-08T23:44:46.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836546 1-42949673021' 2026-03-08T23:44:46.151 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:46.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:44:46.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542199 2026-03-08T23:44:46.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542199 2026-03-08T23:44:46.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836546 1-42949673021 2-60129542199' 2026-03-08T23:44:46.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:46.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836546 2026-03-08T23:44:46.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:46.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:44:46.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836546 2026-03-08T23:44:46.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:46.231 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836546 2026-03-08T23:44:46.231 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836546 2026-03-08T23:44:46.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836546' 2026-03-08T23:44:46.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:46.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836544 -lt 21474836546 2026-03-08T23:44:46.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:47.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:44:47.397 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:47.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836544 -lt 21474836546 2026-03-08T23:44:47.567 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:48.568 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:44:48.569 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:44:48.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836547 -lt 21474836546 2026-03-08T23:44:48.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:48.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673021 2026-03-08T23:44:48.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:48.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:44:48.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673021 2026-03-08T23:44:48.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:48.748 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673021 2026-03-08T23:44:48.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673021 2026-03-08T23:44:48.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673021' 2026-03-08T23:44:48.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:44:48.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 42949673023 -lt 42949673021 2026-03-08T23:44:48.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:48.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542199 2026-03-08T23:44:48.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:48.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:44:48.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542199 2026-03-08T23:44:48.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:48.923 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542199 2026-03-08T23:44:48.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542199 2026-03-08T23:44:48.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542199' 2026-03-08T23:44:48.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:44:49.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542200 -lt 60129542199 2026-03-08T23:44:49.091 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:44:49.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:44:49.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:44:49.257 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:44:49.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:44:49.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:49.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:44:49.271 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:49.272 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:49.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:49.445 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:49.445 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:49.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:49.445 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:49.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:49.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836550 2026-03-08T23:44:49.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836550 2026-03-08T23:44:49.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836550' 2026-03-08T23:44:49.535 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:49.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:44:49.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673025 2026-03-08T23:44:49.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673025 2026-03-08T23:44:49.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836550 1-42949673025' 2026-03-08T23:44:49.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:49.615 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:44:49.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542203 2026-03-08T23:44:49.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542203 2026-03-08T23:44:49.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836550 1-42949673025 2-60129542203' 2026-03-08T23:44:49.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:49.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836550 2026-03-08T23:44:49.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:49.708 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:44:49.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836550 2026-03-08T23:44:49.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:49.710 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836550 2026-03-08T23:44:49.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836550 
2026-03-08T23:44:49.710 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836550' 2026-03-08T23:44:49.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:49.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836550 -lt 21474836550 2026-03-08T23:44:49.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:49.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673025 2026-03-08T23:44:49.887 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:49.888 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:44:49.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673025 2026-03-08T23:44:49.889 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:49.889 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673025 2026-03-08T23:44:49.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673025 2026-03-08T23:44:49.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 
'waiting osd.1 seq 42949673025' 2026-03-08T23:44:49.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:44:50.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673026 -lt 42949673025 2026-03-08T23:44:50.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:50.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542203 2026-03-08T23:44:50.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:50.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:44:50.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542203 2026-03-08T23:44:50.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:50.071 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542203 2026-03-08T23:44:50.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542203 2026-03-08T23:44:50.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542203' 2026-03-08T23:44:50.071 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:44:50.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542203 -lt 60129542203 2026-03-08T23:44:50.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:44:50.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:44:50.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:44:50.418 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:44:50.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:44:50.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:50.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:44:50.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:50.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:50.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:50.603 
INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:50.603 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:50.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:50.603 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:50.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:50.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836552 2026-03-08T23:44:50.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836552 2026-03-08T23:44:50.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836552' 2026-03-08T23:44:50.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:50.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:44:50.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673028 2026-03-08T23:44:50.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673028 2026-03-08T23:44:50.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836552 1-42949673028' 2026-03-08T23:44:50.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:50.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:44:50.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542205 2026-03-08T23:44:50.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542205 2026-03-08T23:44:50.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836552 1-42949673028 2-60129542205' 2026-03-08T23:44:50.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:50.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836552 2026-03-08T23:44:50.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:50.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:44:50.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836552 2026-03-08T23:44:50.837 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:50.838 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836552 2026-03-08T23:44:50.838 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836552 2026-03-08T23:44:50.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836552' 2026-03-08T23:44:50.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:51.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836550 -lt 21474836552 2026-03-08T23:44:51.011 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:52.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:44:52.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:52.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836553 -lt 21474836552 2026-03-08T23:44:52.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:52.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673028 2026-03-08T23:44:52.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:44:52.260 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:44:52.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673028 2026-03-08T23:44:52.261 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:52.262 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673028 2026-03-08T23:44:52.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673028 2026-03-08T23:44:52.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673028' 2026-03-08T23:44:52.262 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:44:52.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673029 -lt 42949673028 2026-03-08T23:44:52.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:52.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542205 2026-03-08T23:44:52.455 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:52.456 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T23:44:52.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542205 2026-03-08T23:44:52.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:52.458 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542205 2026-03-08T23:44:52.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542205 2026-03-08T23:44:52.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542205' 2026-03-08T23:44:52.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:44:52.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542206 -lt 60129542205 2026-03-08T23:44:52.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:44:52.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:44:52.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:44:52.790 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:44:52.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:44:52.806 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:52.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:44:52.806 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:52.806 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:52.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:52.973 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:52.973 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:52.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:52.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:52.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:53.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836556 2026-03-08T23:44:53.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836556 2026-03-08T23:44:53.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836556' 2026-03-08T23:44:53.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:53.054 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:44:53.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673031 2026-03-08T23:44:53.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673031 2026-03-08T23:44:53.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836556 1-42949673031' 2026-03-08T23:44:53.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:53.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:44:53.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542209 2026-03-08T23:44:53.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542209 2026-03-08T23:44:53.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836556 1-42949673031 2-60129542209' 2026-03-08T23:44:53.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:44:53.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836556 2026-03-08T23:44:53.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:53.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:44:53.218 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836556 2026-03-08T23:44:53.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:53.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836556 2026-03-08T23:44:53.220 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836556 2026-03-08T23:44:53.220 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836556' 2026-03-08T23:44:53.220 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:53.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836553 -lt 21474836556 2026-03-08T23:44:53.389 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:54.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 
']' 2026-03-08T23:44:54.391 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:54.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836556 -lt 21474836556 2026-03-08T23:44:54.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:54.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673031 2026-03-08T23:44:54.555 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:54.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:44:54.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673031 2026-03-08T23:44:54.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:54.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673031 2026-03-08T23:44:54.558 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673031 2026-03-08T23:44:54.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673031' 2026-03-08T23:44:54.558 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:44:54.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673032 -lt 42949673031 2026-03-08T23:44:54.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:54.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542209 2026-03-08T23:44:54.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:54.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:44:54.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542209 2026-03-08T23:44:54.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:54.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542209 2026-03-08T23:44:54.734 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542209 2026-03-08T23:44:54.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542209' 2026-03-08T23:44:54.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:44:54.903 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542209 -lt 60129542209 2026-03-08T23:44:54.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:44:54.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:44:54.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:44:55.079 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:44:55.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:44:55.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:55.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:44:55.098 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:55.098 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:55.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:55.286 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:55.286 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:55.286 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:55.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:55.286 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:55.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836559 2026-03-08T23:44:55.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836559 2026-03-08T23:44:55.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836559' 2026-03-08T23:44:55.380 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:55.380 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:44:55.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673034 2026-03-08T23:44:55.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673034 2026-03-08T23:44:55.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836559 1-42949673034' 2026-03-08T23:44:55.493 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:55.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:44:55.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542212 2026-03-08T23:44:55.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542212 2026-03-08T23:44:55.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836559 1-42949673034 2-60129542212' 2026-03-08T23:44:55.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:55.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836559 2026-03-08T23:44:55.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:55.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:44:55.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836559 2026-03-08T23:44:55.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:55.587 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836559 2026-03-08T23:44:55.587 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836559 2026-03-08T23:44:55.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836559' 2026-03-08T23:44:55.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:55.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836556 -lt 21474836559 2026-03-08T23:44:55.795 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:56.796 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:44:56.796 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:56.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836559 -lt 21474836559 2026-03-08T23:44:56.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:56.966 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673034 2026-03-08T23:44:56.966 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:44:56.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:44:56.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673034 2026-03-08T23:44:56.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:56.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673034 2026-03-08T23:44:56.969 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673034 2026-03-08T23:44:56.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673034' 2026-03-08T23:44:56.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:44:57.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673035 -lt 42949673034 2026-03-08T23:44:57.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:44:57.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542212 2026-03-08T23:44:57.146 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:57.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T23:44:57.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542212 2026-03-08T23:44:57.147 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:57.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542212 2026-03-08T23:44:57.148 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542212 2026-03-08T23:44:57.148 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542212' 2026-03-08T23:44:57.149 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:44:57.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542212 -lt 60129542212 2026-03-08T23:44:57.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:44:57.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:44:57.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:44:57.487 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:44:57.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:44:57.500 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:44:57.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:44:57.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:44:57.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:44:57.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:44:57.693 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:44:57.693 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:44:57.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:44:57.693 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:57.693 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:44:57.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836562 2026-03-08T23:44:57.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836562 2026-03-08T23:44:57.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836562' 2026-03-08T23:44:57.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:57.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:44:57.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673038 2026-03-08T23:44:57.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673038 2026-03-08T23:44:57.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836562 1-42949673038' 2026-03-08T23:44:57.872 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:44:57.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:44:57.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542215 2026-03-08T23:44:57.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542215 2026-03-08T23:44:57.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836562 1-42949673038 2-60129542215' 2026-03-08T23:44:57.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:44:57.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836562 2026-03-08T23:44:57.959 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:44:57.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:44:57.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836562 2026-03-08T23:44:57.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:44:57.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836562 2026-03-08T23:44:57.961 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836562 2026-03-08T23:44:57.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836562' 2026-03-08T23:44:57.961 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:58.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836561 -lt 21474836562 2026-03-08T23:44:58.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:44:59.149 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 
']' 2026-03-08T23:44:59.149 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:44:59.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836561 -lt 21474836562 2026-03-08T23:44:59.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:00.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:45:00.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:00.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836564 -lt 21474836562 2026-03-08T23:45:00.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:00.486 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673038 2026-03-08T23:45:00.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:00.488 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:00.488 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673038 2026-03-08T23:45:00.488 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:00.489 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673038 2026-03-08T23:45:00.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673038 2026-03-08T23:45:00.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673038' 2026-03-08T23:45:00.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:00.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673040 -lt 42949673038 2026-03-08T23:45:00.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:00.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542215 2026-03-08T23:45:00.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:00.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:00.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542215 2026-03-08T23:45:00.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:45:00.670 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542215 2026-03-08T23:45:00.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542215 2026-03-08T23:45:00.670 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542215' 2026-03-08T23:45:00.670 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:00.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542217 -lt 60129542215 2026-03-08T23:45:00.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:00.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:00.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:00.991 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:01.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:01.004 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:01.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:01.005 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:01.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:01.218 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:01.219 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:01.219 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:01.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:01.219 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:01.219 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:01.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836567 2026-03-08T23:45:01.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836567 2026-03-08T23:45:01.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836567' 2026-03-08T23:45:01.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:01.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:45:01.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673042 2026-03-08T23:45:01.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673042 2026-03-08T23:45:01.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836567 1-42949673042' 2026-03-08T23:45:01.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:01.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:01.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542220 2026-03-08T23:45:01.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542220 2026-03-08T23:45:01.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836567 1-42949673042 2-60129542220' 2026-03-08T23:45:01.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:01.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836567 2026-03-08T23:45:01.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:01.481 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:01.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836567 2026-03-08T23:45:01.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:01.483 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836567 2026-03-08T23:45:01.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836567 2026-03-08T23:45:01.483 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836567' 2026-03-08T23:45:01.483 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:01.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836564 -lt 21474836567 2026-03-08T23:45:01.650 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:02.651 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:02.651 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:02.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836567 -lt 
21474836567 2026-03-08T23:45:02.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:02.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673042 2026-03-08T23:45:02.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:02.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:02.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673042 2026-03-08T23:45:02.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:02.830 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673042 2026-03-08T23:45:02.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673042 2026-03-08T23:45:02.830 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673042' 2026-03-08T23:45:02.830 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:03.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673043 -lt 42949673042 2026-03-08T23:45:03.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: 
flush_pg_stats: for s in $seqs 2026-03-08T23:45:03.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542220 2026-03-08T23:45:03.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:03.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:03.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542220 2026-03-08T23:45:03.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:03.020 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542220 2026-03-08T23:45:03.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542220 2026-03-08T23:45:03.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542220' 2026-03-08T23:45:03.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:03.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542220 -lt 60129542220 2026-03-08T23:45:03.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:03.195 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:03.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:03.351 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:03.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:03.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:03.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:03.365 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:03.365 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:03.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:03.533 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:03.533 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:03.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:03.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:03.533 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:03.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836570 2026-03-08T23:45:03.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836570 2026-03-08T23:45:03.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836570' 2026-03-08T23:45:03.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:03.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:03.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673046 2026-03-08T23:45:03.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673046 2026-03-08T23:45:03.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836570 1-42949673046' 2026-03-08T23:45:03.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:03.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:03.784 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542223 2026-03-08T23:45:03.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542223 2026-03-08T23:45:03.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836570 1-42949673046 2-60129542223' 2026-03-08T23:45:03.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:03.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836570 2026-03-08T23:45:03.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:03.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:03.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836570 2026-03-08T23:45:03.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:03.786 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836570 2026-03-08T23:45:03.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836570 2026-03-08T23:45:03.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836570' 
2026-03-08T23:45:03.787 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:03.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836570 -lt 21474836570 2026-03-08T23:45:03.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:03.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673046 2026-03-08T23:45:03.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:03.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:03.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673046 2026-03-08T23:45:03.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:03.953 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673046 2026-03-08T23:45:03.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673046 2026-03-08T23:45:03.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673046' 2026-03-08T23:45:03.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 1 2026-03-08T23:45:04.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673046 -lt 42949673046 2026-03-08T23:45:04.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:04.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542223 2026-03-08T23:45:04.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:04.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:04.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542223 2026-03-08T23:45:04.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:04.142 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542223 2026-03-08T23:45:04.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542223 2026-03-08T23:45:04.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542223' 2026-03-08T23:45:04.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:04.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 60129542222 -lt 60129542223 2026-03-08T23:45:04.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:05.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:05.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:05.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542222 -lt 60129542223 2026-03-08T23:45:05.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:06.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:45:06.504 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:06.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542225 -lt 60129542223 2026-03-08T23:45:06.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:06.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:06.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:06.844 
INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:06.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:06.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:06.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:06.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:06.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:07.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:07.033 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:07.033 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:07.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:07.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:07.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:07.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836575 2026-03-08T23:45:07.118 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836575 2026-03-08T23:45:07.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836575' 2026-03-08T23:45:07.118 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:07.118 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:07.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673050 2026-03-08T23:45:07.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673050 2026-03-08T23:45:07.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836575 1-42949673050' 2026-03-08T23:45:07.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:07.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:07.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542228 2026-03-08T23:45:07.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542228 2026-03-08T23:45:07.285 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836575 1-42949673050 2-60129542228' 2026-03-08T23:45:07.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:07.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836575 2026-03-08T23:45:07.285 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:07.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:07.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836575 2026-03-08T23:45:07.287 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:07.288 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836575 2026-03-08T23:45:07.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836575 2026-03-08T23:45:07.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836575' 2026-03-08T23:45:07.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:07.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
21474836572 -lt 21474836575 2026-03-08T23:45:07.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:08.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:08.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:08.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836575 -lt 21474836575 2026-03-08T23:45:08.633 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:08.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673050 2026-03-08T23:45:08.634 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:08.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:08.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673050 2026-03-08T23:45:08.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:08.636 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673050 2026-03-08T23:45:08.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=42949673050 2026-03-08T23:45:08.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673050' 2026-03-08T23:45:08.637 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:08.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673051 -lt 42949673050 2026-03-08T23:45:08.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:08.822 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542228 2026-03-08T23:45:08.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:08.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:08.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542228 2026-03-08T23:45:08.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:08.826 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542228 2026-03-08T23:45:08.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542228 2026-03-08T23:45:08.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.2 seq 60129542228' 2026-03-08T23:45:08.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:09.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542228 -lt 60129542228 2026-03-08T23:45:09.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:09.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:09.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:09.173 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:09.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:09.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:09.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:09.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:09.186 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:09.352 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:09.352 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:09.352 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:09.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:09.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:09.352 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:09.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836578 2026-03-08T23:45:09.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836578 2026-03-08T23:45:09.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836578' 2026-03-08T23:45:09.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:09.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:09.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673053 2026-03-08T23:45:09.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
42949673053 2026-03-08T23:45:09.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836578 1-42949673053' 2026-03-08T23:45:09.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:09.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:09.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542231 2026-03-08T23:45:09.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542231 2026-03-08T23:45:09.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836578 1-42949673053 2-60129542231' 2026-03-08T23:45:09.597 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:09.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836578 2026-03-08T23:45:09.597 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:09.598 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:09.599 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836578 2026-03-08T23:45:09.599 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:09.600 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836578 2026-03-08T23:45:09.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836578 2026-03-08T23:45:09.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836578' 2026-03-08T23:45:09.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:09.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836575 -lt 21474836578 2026-03-08T23:45:09.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:10.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:10.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836578 -lt 21474836578 2026-03-08T23:45:10.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:10.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949673053 2026-03-08T23:45:10.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:10.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:10.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673053 2026-03-08T23:45:10.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:10.957 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673053 2026-03-08T23:45:10.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673053 2026-03-08T23:45:10.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673053' 2026-03-08T23:45:10.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:11.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673054 -lt 42949673053 2026-03-08T23:45:11.132 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:11.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542231 2026-03-08T23:45:11.132 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:11.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:11.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542231 2026-03-08T23:45:11.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:11.135 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542231 2026-03-08T23:45:11.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542231 2026-03-08T23:45:11.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542231' 2026-03-08T23:45:11.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:11.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542231 -lt 60129542231 2026-03-08T23:45:11.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:11.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:11.319 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:11.485 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:11.500 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:11.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:11.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:11.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:11.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:11.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:11.669 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:11.669 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:11.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:11.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:11.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:11.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836581 2026-03-08T23:45:11.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836581 
2026-03-08T23:45:11.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836581' 2026-03-08T23:45:11.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:11.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:11.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673057 2026-03-08T23:45:11.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673057 2026-03-08T23:45:11.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836581 1-42949673057' 2026-03-08T23:45:11.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:11.835 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:11.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542234 2026-03-08T23:45:11.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542234 2026-03-08T23:45:11.906 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836581 1-42949673057 2-60129542234' 2026-03-08T23:45:11.906 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:11.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836581 2026-03-08T23:45:11.906 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:11.907 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:11.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836581 2026-03-08T23:45:11.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:11.909 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836581 2026-03-08T23:45:11.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836581 2026-03-08T23:45:11.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836581' 2026-03-08T23:45:11.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:12.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836581 -lt 21474836581 2026-03-08T23:45:12.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:45:12.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673057
2026-03-08T23:45:12.083 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:12.084 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:45:12.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673057
2026-03-08T23:45:12.085 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:12.086 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673057
2026-03-08T23:45:12.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673057
2026-03-08T23:45:12.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673057'
2026-03-08T23:45:12.086 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:12.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673056 -lt 42949673057
2026-03-08T23:45:12.265 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:13.266 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:45:13.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:13.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673056 -lt 42949673057
2026-03-08T23:45:13.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:14.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']'
2026-03-08T23:45:14.433 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:14.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673059 -lt 42949673057
2026-03-08T23:45:14.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:14.618 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542234
2026-03-08T23:45:14.618 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:14.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:45:14.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542234
2026-03-08T23:45:14.619 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:14.620 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542234
2026-03-08T23:45:14.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542234
2026-03-08T23:45:14.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542234'
2026-03-08T23:45:14.620 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:14.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542236 -lt 60129542234
2026-03-08T23:45:14.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:45:14.791 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:45:14.792 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:45:14.949 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:45:14.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:45:14.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:45:14.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:45:14.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:45:14.963 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:45:15.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:45:15.134 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:45:15.134 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:45:15.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:45:15.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:15.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:45:15.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836586
2026-03-08T23:45:15.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836586
2026-03-08T23:45:15.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836586'
2026-03-08T23:45:15.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:15.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:45:15.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673061
2026-03-08T23:45:15.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673061
2026-03-08T23:45:15.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836586 1-42949673061'
2026-03-08T23:45:15.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:15.297 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:45:15.373 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542239
2026-03-08T23:45:15.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542239
2026-03-08T23:45:15.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836586 1-42949673061 2-60129542239'
2026-03-08T23:45:15.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:15.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836586
2026-03-08T23:45:15.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:15.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:45:15.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836586
2026-03-08T23:45:15.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:15.376 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836586
2026-03-08T23:45:15.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836586
2026-03-08T23:45:15.376 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836586'
2026-03-08T23:45:15.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:15.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836583 -lt 21474836586
2026-03-08T23:45:15.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:16.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:45:16.541 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:16.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836586 -lt 21474836586
2026-03-08T23:45:16.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:16.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673061
2026-03-08T23:45:16.717 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:16.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:45:16.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673061
2026-03-08T23:45:16.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:16.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673061
2026-03-08T23:45:16.720 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673061
2026-03-08T23:45:16.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673061'
2026-03-08T23:45:16.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:16.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673062 -lt 42949673061
2026-03-08T23:45:16.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:16.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542239
2026-03-08T23:45:16.888 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:16.889 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:45:16.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542239
2026-03-08T23:45:16.890 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:16.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542239
2026-03-08T23:45:16.891 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542239
2026-03-08T23:45:16.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542239'
2026-03-08T23:45:16.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:17.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542239 -lt 60129542239
2026-03-08T23:45:17.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:45:17.060 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:45:17.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:45:17.210 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:45:17.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:45:17.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:45:17.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:45:17.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:45:17.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:45:17.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:45:17.386 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:45:17.386 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:45:17.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:45:17.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:17.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:45:17.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836589
2026-03-08T23:45:17.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836589
2026-03-08T23:45:17.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836589'
2026-03-08T23:45:17.461 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:17.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:45:17.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673064
2026-03-08T23:45:17.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673064
2026-03-08T23:45:17.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836589 1-42949673064'
2026-03-08T23:45:17.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:17.535 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:45:17.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542242
2026-03-08T23:45:17.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542242
2026-03-08T23:45:17.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836589 1-42949673064 2-60129542242'
2026-03-08T23:45:17.613 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:17.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836589
2026-03-08T23:45:17.613 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:17.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:45:17.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836589
2026-03-08T23:45:17.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:17.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836589
2026-03-08T23:45:17.615 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836589
2026-03-08T23:45:17.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836589'
2026-03-08T23:45:17.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:17.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836586 -lt 21474836589
2026-03-08T23:45:17.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:18.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:45:18.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:18.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836589 -lt 21474836589
2026-03-08T23:45:18.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:18.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673064
2026-03-08T23:45:18.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:18.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:45:18.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673064
2026-03-08T23:45:18.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:18.952 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673064
2026-03-08T23:45:18.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673064
2026-03-08T23:45:18.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673064'
2026-03-08T23:45:18.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:19.125 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673065 -lt 42949673064
2026-03-08T23:45:19.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:19.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542242
2026-03-08T23:45:19.126 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:19.128 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:45:19.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542242
2026-03-08T23:45:19.128 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:19.130 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542242
2026-03-08T23:45:19.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542242
2026-03-08T23:45:19.130 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542242'
2026-03-08T23:45:19.130 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:19.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542242 -lt 60129542242
2026-03-08T23:45:19.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:45:19.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:45:19.308 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:45:19.461 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:45:19.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:45:19.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:45:19.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:45:19.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:45:19.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:45:19.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:45:19.636 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:45:19.636 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:45:19.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:45:19.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:19.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:45:19.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836592
2026-03-08T23:45:19.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836592
2026-03-08T23:45:19.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836592'
2026-03-08T23:45:19.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:19.712 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:45:19.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673068
2026-03-08T23:45:19.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673068
2026-03-08T23:45:19.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836592 1-42949673068'
2026-03-08T23:45:19.793 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:19.793 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:45:19.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542245
2026-03-08T23:45:19.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542245
2026-03-08T23:45:19.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836592 1-42949673068 2-60129542245'
2026-03-08T23:45:19.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:19.877 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836592
2026-03-08T23:45:19.877 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:19.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:45:19.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836592
2026-03-08T23:45:19.878 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:19.879 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836592
2026-03-08T23:45:19.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836592
2026-03-08T23:45:19.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836592'
2026-03-08T23:45:19.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:20.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836592 -lt 21474836592
2026-03-08T23:45:20.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:20.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673068
2026-03-08T23:45:20.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:20.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:45:20.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673068
2026-03-08T23:45:20.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:20.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673068
2026-03-08T23:45:20.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673068'
2026-03-08T23:45:20.058 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673068
2026-03-08T23:45:20.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:20.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673067 -lt 42949673068
2026-03-08T23:45:20.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:21.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:45:21.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:21.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673067 -lt 42949673068
2026-03-08T23:45:21.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:22.393 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']'
2026-03-08T23:45:22.393 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:22.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673070 -lt 42949673068
2026-03-08T23:45:22.570 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:22.570 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542245
2026-03-08T23:45:22.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:22.572 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:45:22.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542245
2026-03-08T23:45:22.572 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:22.573 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542245
2026-03-08T23:45:22.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542245
2026-03-08T23:45:22.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542245'
2026-03-08T23:45:22.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:22.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542247 -lt 60129542245
2026-03-08T23:45:22.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:45:22.753 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:45:22.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:45:22.911 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:45:22.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:45:22.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:45:22.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:45:22.924 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:45:22.924 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:23.091 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:23.091 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:23.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:23.091 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:23.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836597 2026-03-08T23:45:23.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836597 2026-03-08T23:45:23.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836597' 2026-03-08T23:45:23.174 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:23.174 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:23.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=42949673072 2026-03-08T23:45:23.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673072 2026-03-08T23:45:23.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836597 1-42949673072' 2026-03-08T23:45:23.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:23.254 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:23.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542250 2026-03-08T23:45:23.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542250 2026-03-08T23:45:23.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836597 1-42949673072 2-60129542250' 2026-03-08T23:45:23.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:23.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836597 2026-03-08T23:45:23.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:23.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:23.335 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836597 2026-03-08T23:45:23.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:23.336 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836597 2026-03-08T23:45:23.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836597 2026-03-08T23:45:23.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836597' 2026-03-08T23:45:23.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:23.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836594 -lt 21474836597 2026-03-08T23:45:23.504 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:24.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:24.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:24.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836597 -lt 21474836597 2026-03-08T23:45:24.674 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T23:45:24.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673072 2026-03-08T23:45:24.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:24.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:24.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673072 2026-03-08T23:45:24.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:24.677 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673072 2026-03-08T23:45:24.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673072 2026-03-08T23:45:24.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673072' 2026-03-08T23:45:24.678 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:24.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673073 -lt 42949673072 2026-03-08T23:45:24.857 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:24.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-60129542250 2026-03-08T23:45:24.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:24.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:24.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542250 2026-03-08T23:45:24.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:24.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542250 2026-03-08T23:45:24.860 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542250 2026-03-08T23:45:24.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542250' 2026-03-08T23:45:24.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:25.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542250 -lt 60129542250 2026-03-08T23:45:25.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:25.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:25.033 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:25.189 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:25.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:25.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:25.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:25.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:25.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:25.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:25.369 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:25.369 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:25.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:25.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:25.369 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:25.446 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836600 2026-03-08T23:45:25.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836600 2026-03-08T23:45:25.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836600' 2026-03-08T23:45:25.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:25.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:25.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673075 2026-03-08T23:45:25.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673075 2026-03-08T23:45:25.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836600 1-42949673075' 2026-03-08T23:45:25.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:25.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:25.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542253 2026-03-08T23:45:25.607 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542253 2026-03-08T23:45:25.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836600 1-42949673075 2-60129542253' 2026-03-08T23:45:25.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:25.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836600 2026-03-08T23:45:25.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:25.608 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:25.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836600 2026-03-08T23:45:25.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:25.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836600 2026-03-08T23:45:25.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836600' 2026-03-08T23:45:25.610 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836600 2026-03-08T23:45:25.610 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:45:25.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836597 -lt 21474836600 2026-03-08T23:45:25.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:26.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:26.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:26.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836600 -lt 21474836600 2026-03-08T23:45:26.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:26.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673075 2026-03-08T23:45:26.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:26.946 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:26.946 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673075 2026-03-08T23:45:26.947 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:26.947 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 
seq 42949673075 2026-03-08T23:45:26.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673075 2026-03-08T23:45:26.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673075' 2026-03-08T23:45:26.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:27.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673076 -lt 42949673075 2026-03-08T23:45:27.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:27.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542253 2026-03-08T23:45:27.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:27.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:27.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542253 2026-03-08T23:45:27.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:27.113 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542253 2026-03-08T23:45:27.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: 
flush_pg_stats: seq=60129542253 2026-03-08T23:45:27.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542253' 2026-03-08T23:45:27.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:27.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542253 -lt 60129542253 2026-03-08T23:45:27.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:27.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:27.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:27.421 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:27.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:27.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:27.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:27.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:27.435 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:27.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:27.601 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:27.601 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:27.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:27.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:27.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:27.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836603 2026-03-08T23:45:27.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836603 2026-03-08T23:45:27.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836603' 2026-03-08T23:45:27.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:27.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:27.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=42949673079 2026-03-08T23:45:27.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673079 2026-03-08T23:45:27.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836603 1-42949673079' 2026-03-08T23:45:27.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:27.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:27.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542256 2026-03-08T23:45:27.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542256 2026-03-08T23:45:27.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836603 1-42949673079 2-60129542256' 2026-03-08T23:45:27.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:27.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836603 2026-03-08T23:45:27.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:27.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:27.841 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836603 2026-03-08T23:45:27.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:27.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836603 2026-03-08T23:45:27.843 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836603 2026-03-08T23:45:27.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836603' 2026-03-08T23:45:27.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:28.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836603 -lt 21474836603 2026-03-08T23:45:28.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:28.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673079 2026-03-08T23:45:28.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:28.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:28.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673079 
2026-03-08T23:45:28.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:28.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673079 2026-03-08T23:45:28.010 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673079 2026-03-08T23:45:28.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673079' 2026-03-08T23:45:28.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:28.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673078 -lt 42949673079 2026-03-08T23:45:28.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:29.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:29.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:29.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673078 -lt 42949673079 2026-03-08T23:45:29.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:30.346 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: 
flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:45:30.347 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:30.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673081 -lt 42949673079 2026-03-08T23:45:30.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:30.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542256 2026-03-08T23:45:30.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:30.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:30.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542256 2026-03-08T23:45:30.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:30.513 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542256 2026-03-08T23:45:30.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542256 2026-03-08T23:45:30.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542256' 2026-03-08T23:45:30.513 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:30.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542258 -lt 60129542256 2026-03-08T23:45:30.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:30.675 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:30.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:30.829 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:30.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:30.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:30.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:30.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:30.843 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:31.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:31.010 
INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:31.010 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:31.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:31.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:31.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:31.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836608 2026-03-08T23:45:31.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836608 2026-03-08T23:45:31.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836608' 2026-03-08T23:45:31.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:31.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:31.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673083 2026-03-08T23:45:31.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673083 2026-03-08T23:45:31.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836608 1-42949673083' 2026-03-08T23:45:31.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:31.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:31.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542261 2026-03-08T23:45:31.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542261 2026-03-08T23:45:31.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836608 1-42949673083 2-60129542261' 2026-03-08T23:45:31.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:31.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836608 2026-03-08T23:45:31.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:31.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:31.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836608 2026-03-08T23:45:31.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:31.253 
INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836608 2026-03-08T23:45:31.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836608 2026-03-08T23:45:31.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836608' 2026-03-08T23:45:31.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:31.425 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836605 -lt 21474836608 2026-03-08T23:45:31.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:32.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:32.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:32.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836608 -lt 21474836608 2026-03-08T23:45:32.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:32.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673083 2026-03-08T23:45:32.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: 
cut -d - -f 1 2026-03-08T23:45:32.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:32.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673083 2026-03-08T23:45:32.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:32.590 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673083 2026-03-08T23:45:32.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673083 2026-03-08T23:45:32.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673083' 2026-03-08T23:45:32.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:32.755 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673084 -lt 42949673083 2026-03-08T23:45:32.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:32.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542261 2026-03-08T23:45:32.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:32.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: osd=2 2026-03-08T23:45:32.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542261 2026-03-08T23:45:32.757 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:32.758 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542261 2026-03-08T23:45:32.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542261 2026-03-08T23:45:32.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542261' 2026-03-08T23:45:32.758 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:32.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542261 -lt 60129542261 2026-03-08T23:45:32.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:32.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:32.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:33.076 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:33.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:33.090 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:33.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:33.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:33.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:33.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:33.269 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:33.269 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:33.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:33.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:33.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:33.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836611 2026-03-08T23:45:33.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836611 2026-03-08T23:45:33.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836611' 2026-03-08T23:45:33.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:33.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:33.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673086 2026-03-08T23:45:33.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673086 2026-03-08T23:45:33.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836611 1-42949673086' 2026-03-08T23:45:33.435 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:33.435 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:33.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542264 2026-03-08T23:45:33.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542264 2026-03-08T23:45:33.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836611 1-42949673086 2-60129542264' 2026-03-08T23:45:33.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:45:33.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836611 2026-03-08T23:45:33.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:33.514 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:33.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836611 2026-03-08T23:45:33.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:33.516 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836611 2026-03-08T23:45:33.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836611 2026-03-08T23:45:33.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836611' 2026-03-08T23:45:33.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:33.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836608 -lt 21474836611 2026-03-08T23:45:33.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:34.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 
']' 2026-03-08T23:45:34.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:34.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836611 -lt 21474836611 2026-03-08T23:45:34.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:34.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673086 2026-03-08T23:45:34.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:34.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:34.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673086 2026-03-08T23:45:34.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:34.857 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673086 2026-03-08T23:45:34.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673086 2026-03-08T23:45:34.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673086' 2026-03-08T23:45:34.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:45:35.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673087 -lt 42949673086 2026-03-08T23:45:35.028 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:35.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542264 2026-03-08T23:45:35.028 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:35.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:35.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542264 2026-03-08T23:45:35.030 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:35.031 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542264 2026-03-08T23:45:35.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542264 2026-03-08T23:45:35.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542264' 2026-03-08T23:45:35.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:35.200 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542264 -lt 60129542264 2026-03-08T23:45:35.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:35.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:35.201 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:35.353 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:35.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:35.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:35.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:35.366 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:35.366 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:35.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:35.529 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:35.529 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:35.529 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:35.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:35.529 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:35.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836614 2026-03-08T23:45:35.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836614 2026-03-08T23:45:35.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836614' 2026-03-08T23:45:35.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:35.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:35.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673089 2026-03-08T23:45:35.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673089 2026-03-08T23:45:35.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836614 1-42949673089' 2026-03-08T23:45:35.689 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:35.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:35.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542267 2026-03-08T23:45:35.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542267 2026-03-08T23:45:35.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836614 1-42949673089 2-60129542267' 2026-03-08T23:45:35.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:35.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836614 2026-03-08T23:45:35.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:35.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:35.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836614 2026-03-08T23:45:35.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:35.770 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836614 2026-03-08T23:45:35.770 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836614 2026-03-08T23:45:35.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836614' 2026-03-08T23:45:35.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:35.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836614 -lt 21474836614 2026-03-08T23:45:35.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:35.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673089 2026-03-08T23:45:35.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:35.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:35.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673089 2026-03-08T23:45:35.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:35.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673089 
2026-03-08T23:45:35.940 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673089
2026-03-08T23:45:35.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673089'
2026-03-08T23:45:35.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:36.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673090 -lt 42949673089
2026-03-08T23:45:36.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:36.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542267
2026-03-08T23:45:36.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:36.103 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:45:36.103 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542267
2026-03-08T23:45:36.104 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:36.104 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542267
2026-03-08T23:45:36.105 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542267
2026-03-08T23:45:36.105 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542267'
2026-03-08T23:45:36.105 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:36.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542266 -lt 60129542267
2026-03-08T23:45:36.273 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:37.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:45:37.275 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:37.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542266 -lt 60129542267
2026-03-08T23:45:37.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:38.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']'
2026-03-08T23:45:38.441 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:38.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542269 -lt 60129542267
2026-03-08T23:45:38.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:45:38.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:45:38.615 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:45:38.772 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:45:38.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:45:38.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:45:38.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:45:38.785 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:45:38.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:45:38.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:45:38.965 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:45:38.965 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:45:38.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:45:38.965 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:38.965 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:45:39.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836619
2026-03-08T23:45:39.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836619
2026-03-08T23:45:39.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836619'
2026-03-08T23:45:39.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:39.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:45:39.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673094
2026-03-08T23:45:39.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673094
2026-03-08T23:45:39.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836619 1-42949673094'
2026-03-08T23:45:39.111 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:39.111 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:45:39.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542272
2026-03-08T23:45:39.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542272
2026-03-08T23:45:39.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836619 1-42949673094 2-60129542272'
2026-03-08T23:45:39.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:39.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836619
2026-03-08T23:45:39.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:39.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:45:39.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836619
2026-03-08T23:45:39.189 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:39.190 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836619
2026-03-08T23:45:39.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836619
2026-03-08T23:45:39.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836619'
2026-03-08T23:45:39.190 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:39.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836616 -lt 21474836619
2026-03-08T23:45:39.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:40.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:45:40.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:40.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836619 -lt 21474836619
2026-03-08T23:45:40.533 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:40.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673094
2026-03-08T23:45:40.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:40.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:45:40.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673094
2026-03-08T23:45:40.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:40.536 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673094
2026-03-08T23:45:40.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673094
2026-03-08T23:45:40.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673094'
2026-03-08T23:45:40.536 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:40.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673095 -lt 42949673094
2026-03-08T23:45:40.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:40.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542272
2026-03-08T23:45:40.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:40.715 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:45:40.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542272
2026-03-08T23:45:40.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:40.716 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542272
2026-03-08T23:45:40.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542272
2026-03-08T23:45:40.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542272'
2026-03-08T23:45:40.716 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:40.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542272 -lt 60129542272
2026-03-08T23:45:40.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:45:40.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:45:40.887 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:45:41.049 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:45:41.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:45:41.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:45:41.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:45:41.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:45:41.062 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:45:41.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:45:41.235 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:45:41.235 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:45:41.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:45:41.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:41.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:45:41.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836622
2026-03-08T23:45:41.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836622
2026-03-08T23:45:41.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836622'
2026-03-08T23:45:41.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:41.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:45:41.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673097
2026-03-08T23:45:41.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673097
2026-03-08T23:45:41.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836622 1-42949673097'
2026-03-08T23:45:41.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:41.398 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:45:41.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542275
2026-03-08T23:45:41.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542275
2026-03-08T23:45:41.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836622 1-42949673097 2-60129542275'
2026-03-08T23:45:41.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:41.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836622
2026-03-08T23:45:41.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:41.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:45:41.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836622
2026-03-08T23:45:41.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:41.479 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836622
2026-03-08T23:45:41.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836622
2026-03-08T23:45:41.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836622'
2026-03-08T23:45:41.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:41.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836619 -lt 21474836622
2026-03-08T23:45:41.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:42.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:45:42.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:42.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836622 -lt 21474836622
2026-03-08T23:45:42.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:42.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673097
2026-03-08T23:45:42.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:42.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:45:42.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673097
2026-03-08T23:45:42.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:42.828 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673097
2026-03-08T23:45:42.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673097
2026-03-08T23:45:42.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673097'
2026-03-08T23:45:42.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:43.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673098 -lt 42949673097
2026-03-08T23:45:43.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:43.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542275
2026-03-08T23:45:43.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:43.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:45:43.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542275
2026-03-08T23:45:43.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:43.009 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542275
2026-03-08T23:45:43.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542275
2026-03-08T23:45:43.010 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542275'
2026-03-08T23:45:43.010 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:43.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542275 -lt 60129542275
2026-03-08T23:45:43.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:45:43.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:45:43.177 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:45:43.347 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:45:43.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:45:43.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:45:43.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:45:43.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:45:43.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:45:43.541 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:45:43.542 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:45:43.542 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:45:43.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:45:43.542 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:43.542 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:45:43.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836625
2026-03-08T23:45:43.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836625
2026-03-08T23:45:43.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836625'
2026-03-08T23:45:43.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:43.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:45:43.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673101
2026-03-08T23:45:43.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673101
2026-03-08T23:45:43.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836625 1-42949673101'
2026-03-08T23:45:43.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:43.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:45:43.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542278
2026-03-08T23:45:43.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542278
2026-03-08T23:45:43.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836625 1-42949673101 2-60129542278'
2026-03-08T23:45:43.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:43.830 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836625
2026-03-08T23:45:43.830 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:43.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:45:43.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836625
2026-03-08T23:45:43.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:43.832 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836625
2026-03-08T23:45:43.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836625
2026-03-08T23:45:43.832 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836625'
2026-03-08T23:45:43.832 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:45:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836625 -lt 21474836625
2026-03-08T23:45:44.013 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:44.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673101
2026-03-08T23:45:44.013 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:44.014 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:45:44.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673101
2026-03-08T23:45:44.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:44.015 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673101
2026-03-08T23:45:44.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673101
2026-03-08T23:45:44.015 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673101'
2026-03-08T23:45:44.016 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:45:44.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673101 -lt 42949673101
2026-03-08T23:45:44.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:45:44.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:45:44.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542278
2026-03-08T23:45:44.195 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:45:44.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542278
2026-03-08T23:45:44.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:45:44.197 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542278
2026-03-08T23:45:44.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542278
2026-03-08T23:45:44.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542278'
2026-03-08T23:45:44.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:44.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542277 -lt 60129542278
2026-03-08T23:45:44.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:45.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:45:45.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:45.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542277 -lt 60129542278
2026-03-08T23:45:45.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:45:46.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']'
2026-03-08T23:45:46.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:45:46.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542280 -lt 60129542278
2026-03-08T23:45:46.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:45:46.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:45:46.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:45:46.962 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:45:46.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:45:46.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:45:46.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:45:46.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:45:46.979 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:45:47.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:45:47.173 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:45:47.173 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:45:47.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:45:47.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:45:47.173 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:47.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836630 2026-03-08T23:45:47.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836630 2026-03-08T23:45:47.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836630' 2026-03-08T23:45:47.250 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:47.250 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:47.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673105 2026-03-08T23:45:47.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673105 2026-03-08T23:45:47.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836630 1-42949673105' 2026-03-08T23:45:47.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:47.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:47.473 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542283 2026-03-08T23:45:47.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542283 2026-03-08T23:45:47.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836630 1-42949673105 2-60129542283' 2026-03-08T23:45:47.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:47.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836630 2026-03-08T23:45:47.474 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:47.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:47.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836630 2026-03-08T23:45:47.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:47.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836630 2026-03-08T23:45:47.476 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836630 2026-03-08T23:45:47.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836630' 
2026-03-08T23:45:47.476 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:47.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836627 -lt 21474836630 2026-03-08T23:45:47.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:48.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:48.668 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:48.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836630 -lt 21474836630 2026-03-08T23:45:48.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:48.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673105 2026-03-08T23:45:48.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:48.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:48.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673105 2026-03-08T23:45:48.859 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:48.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673105 2026-03-08T23:45:48.860 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673105 2026-03-08T23:45:48.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673105' 2026-03-08T23:45:48.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:49.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673106 -lt 42949673105 2026-03-08T23:45:49.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:49.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542283 2026-03-08T23:45:49.051 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:49.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:49.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542283 2026-03-08T23:45:49.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:45:49.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542283 2026-03-08T23:45:49.054 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542283 2026-03-08T23:45:49.054 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542283' 2026-03-08T23:45:49.054 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:49.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542283 -lt 60129542283 2026-03-08T23:45:49.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:49.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:49.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:49.388 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:49.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:49.403 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:49.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:49.404 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:49.404 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:49.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:49.579 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:49.579 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:49.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:49.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:49.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:49.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836633 2026-03-08T23:45:49.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836633 2026-03-08T23:45:49.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836633' 2026-03-08T23:45:49.665 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:49.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 
flush_pg_stats 2026-03-08T23:45:49.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673109 2026-03-08T23:45:49.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673109 2026-03-08T23:45:49.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836633 1-42949673109' 2026-03-08T23:45:49.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:49.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:49.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542286 2026-03-08T23:45:49.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542286 2026-03-08T23:45:49.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836633 1-42949673109 2-60129542286' 2026-03-08T23:45:49.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:49.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836633 2026-03-08T23:45:49.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:49.840 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:49.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836633 2026-03-08T23:45:49.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:49.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836633 2026-03-08T23:45:49.842 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836633 2026-03-08T23:45:49.842 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836633' 2026-03-08T23:45:49.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:50.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836633 -lt 21474836633 2026-03-08T23:45:50.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:50.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673109 2026-03-08T23:45:50.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:50.022 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:50.022 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673109 2026-03-08T23:45:50.023 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:50.024 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673109 2026-03-08T23:45:50.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673109 2026-03-08T23:45:50.024 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673109' 2026-03-08T23:45:50.024 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:50.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673109 -lt 42949673109 2026-03-08T23:45:50.199 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:50.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542286 2026-03-08T23:45:50.199 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:50.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:50.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542286 
2026-03-08T23:45:50.201 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:50.202 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542286 2026-03-08T23:45:50.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542286 2026-03-08T23:45:50.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542286' 2026-03-08T23:45:50.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:50.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542285 -lt 60129542286 2026-03-08T23:45:50.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:51.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:51.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:51.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542285 -lt 60129542286 2026-03-08T23:45:51.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:52.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: 
flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:45:52.552 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:52.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542288 -lt 60129542286 2026-03-08T23:45:52.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:52.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:52.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:52.893 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:52.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:52.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:52.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:52.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:52.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:53.090 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:53.090 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:53.090 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:53.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:53.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:53.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:53.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836638 2026-03-08T23:45:53.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836638 2026-03-08T23:45:53.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836638' 2026-03-08T23:45:53.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:53.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:53.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673113 2026-03-08T23:45:53.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
42949673113 2026-03-08T23:45:53.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836638 1-42949673113' 2026-03-08T23:45:53.269 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:53.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:53.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542291 2026-03-08T23:45:53.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542291 2026-03-08T23:45:53.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836638 1-42949673113 2-60129542291' 2026-03-08T23:45:53.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:53.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836638 2026-03-08T23:45:53.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:53.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:53.362 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836638 2026-03-08T23:45:53.362 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:53.363 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836638 2026-03-08T23:45:53.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836638 2026-03-08T23:45:53.364 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836638' 2026-03-08T23:45:53.364 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:53.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836635 -lt 21474836638 2026-03-08T23:45:53.580 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:54.581 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:54.582 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:54.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836638 -lt 21474836638 2026-03-08T23:45:54.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:54.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949673113 2026-03-08T23:45:54.764 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:54.765 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:54.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673113 2026-03-08T23:45:54.765 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:54.766 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673113 2026-03-08T23:45:54.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673113 2026-03-08T23:45:54.766 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673113' 2026-03-08T23:45:54.767 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:54.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673114 -lt 42949673113 2026-03-08T23:45:54.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:54.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542291 2026-03-08T23:45:54.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:54.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:54.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542291 2026-03-08T23:45:54.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:54.952 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542291 2026-03-08T23:45:54.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542291 2026-03-08T23:45:54.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542291' 2026-03-08T23:45:54.952 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:55.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542291 -lt 60129542291 2026-03-08T23:45:55.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:55.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:55.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:55.320 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:55.335 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:55.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:55.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:55.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:55.335 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:55.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:55.519 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:55.519 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:55.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:55.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:55.519 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836641 2026-03-08T23:45:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836641 
2026-03-08T23:45:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836641' 2026-03-08T23:45:55.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:55.604 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:55.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673116 2026-03-08T23:45:55.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673116 2026-03-08T23:45:55.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836641 1-42949673116' 2026-03-08T23:45:55.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:55.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:55.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542294 2026-03-08T23:45:55.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542294 2026-03-08T23:45:55.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836641 1-42949673116 2-60129542294' 2026-03-08T23:45:55.783 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:55.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836641 2026-03-08T23:45:55.784 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:55.784 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:55.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836641 2026-03-08T23:45:55.785 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:55.786 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836641 2026-03-08T23:45:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836641 2026-03-08T23:45:55.786 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836641' 2026-03-08T23:45:55.786 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:55.973 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836641 -lt 21474836641 2026-03-08T23:45:55.974 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:45:55.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673116 2026-03-08T23:45:55.974 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:55.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:45:55.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673116 2026-03-08T23:45:55.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:55.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673116 2026-03-08T23:45:55.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673116' 2026-03-08T23:45:55.977 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673116 2026-03-08T23:45:55.977 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:45:56.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673117 -lt 42949673116 2026-03-08T23:45:56.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:56.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
2-60129542294 2026-03-08T23:45:56.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:56.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:45:56.167 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542294 2026-03-08T23:45:56.167 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:56.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542294 2026-03-08T23:45:56.168 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542294 2026-03-08T23:45:56.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542294' 2026-03-08T23:45:56.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:56.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542293 -lt 60129542294 2026-03-08T23:45:56.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:57.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:45:57.359 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 2 2026-03-08T23:45:57.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542293 -lt 60129542294 2026-03-08T23:45:57.536 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:45:58.537 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:45:58.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:45:58.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542296 -lt 60129542294 2026-03-08T23:45:58.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:45:58.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:45:58.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:45:58.884 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:45:58.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:45:58.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:45:58.898 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:45:58.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:45:58.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:45:59.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:45:59.073 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:45:59.073 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:45:59.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:45:59.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:59.074 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:45:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836646 2026-03-08T23:45:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836646 2026-03-08T23:45:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836646' 2026-03-08T23:45:59.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 
2026-03-08T23:45:59.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:45:59.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673121 2026-03-08T23:45:59.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673121 2026-03-08T23:45:59.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836646 1-42949673121' 2026-03-08T23:45:59.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:45:59.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:45:59.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542299 2026-03-08T23:45:59.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542299 2026-03-08T23:45:59.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836646 1-42949673121 2-60129542299' 2026-03-08T23:45:59.336 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:45:59.337 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836646 2026-03-08T23:45:59.337 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:45:59.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:45:59.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836646 2026-03-08T23:45:59.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:45:59.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836646 2026-03-08T23:45:59.339 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836646 2026-03-08T23:45:59.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836646' 2026-03-08T23:45:59.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:45:59.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836643 -lt 21474836646 2026-03-08T23:45:59.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:46:00.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:46:00.512 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 
2026-03-08T23:46:00.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836646 -lt 21474836646 2026-03-08T23:46:00.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:00.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673121 2026-03-08T23:46:00.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:00.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:46:00.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673121 2026-03-08T23:46:00.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:00.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673121 2026-03-08T23:46:00.692 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673121' 2026-03-08T23:46:00.692 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673121 2026-03-08T23:46:00.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:46:00.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
42949673122 -lt 42949673121 2026-03-08T23:46:00.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:00.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542299 2026-03-08T23:46:00.874 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:00.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:46:00.875 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542299 2026-03-08T23:46:00.875 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:00.876 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542299 2026-03-08T23:46:00.876 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542299 2026-03-08T23:46:00.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542299' 2026-03-08T23:46:00.877 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:46:01.064 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542299 -lt 60129542299 2026-03-08T23:46:01.064 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:46:01.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:46:01.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:46:01.241 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:46:01.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:46:01.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:46:01.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:46:01.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:46:01.255 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:46:01.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:46:01.441 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:46:01.441 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:46:01.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:46:01.441 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:01.441 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:46:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836649 2026-03-08T23:46:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836649 2026-03-08T23:46:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836649' 2026-03-08T23:46:01.534 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:01.534 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:46:01.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673124 2026-03-08T23:46:01.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673124 2026-03-08T23:46:01.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836649 1-42949673124' 2026-03-08T23:46:01.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:01.618 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:46:01.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542302 2026-03-08T23:46:01.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542302 2026-03-08T23:46:01.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836649 1-42949673124 2-60129542302' 2026-03-08T23:46:01.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:01.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:01.701 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836649 2026-03-08T23:46:01.702 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:46:01.702 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836649 2026-03-08T23:46:01.703 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:01.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836649 2026-03-08T23:46:01.703 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836649 
2026-03-08T23:46:01.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836649'
2026-03-08T23:46:01.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:01.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836649 -lt 21474836649
2026-03-08T23:46:01.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:01.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673124
2026-03-08T23:46:01.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:01.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:01.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673124
2026-03-08T23:46:01.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:01.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673124
2026-03-08T23:46:01.881 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673124
2026-03-08T23:46:01.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673124'
2026-03-08T23:46:01.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:02.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673125 -lt 42949673124
2026-03-08T23:46:02.063 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:02.063 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542302
2026-03-08T23:46:02.064 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:02.065 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:46:02.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542302
2026-03-08T23:46:02.065 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:02.066 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542302
2026-03-08T23:46:02.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542302
2026-03-08T23:46:02.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542302'
2026-03-08T23:46:02.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:02.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542302 -lt 60129542302
2026-03-08T23:46:02.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:02.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:02.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:02.403 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:02.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:46:02.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:46:02.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:46:02.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:02.416 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:02.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:02.594 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:02.594 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:02.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:02.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:02.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:02.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836651
2026-03-08T23:46:02.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836651
2026-03-08T23:46:02.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836651'
2026-03-08T23:46:02.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:02.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:02.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673127
2026-03-08T23:46:02.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673127
2026-03-08T23:46:02.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836651 1-42949673127'
2026-03-08T23:46:02.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:02.772 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:02.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542304
2026-03-08T23:46:02.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542304
2026-03-08T23:46:02.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836651 1-42949673127 2-60129542304'
2026-03-08T23:46:02.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:02.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836651
2026-03-08T23:46:02.855 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:02.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:02.856 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836651
2026-03-08T23:46:02.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:02.858 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836651
2026-03-08T23:46:02.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836651
2026-03-08T23:46:02.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836651'
2026-03-08T23:46:02.858 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:03.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836649 -lt 21474836651
2026-03-08T23:46:03.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:46:04.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:46:04.038 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:04.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836652 -lt 21474836651
2026-03-08T23:46:04.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:04.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673127
2026-03-08T23:46:04.223 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:04.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:04.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673127
2026-03-08T23:46:04.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:04.225 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673127
2026-03-08T23:46:04.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673127
2026-03-08T23:46:04.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673127'
2026-03-08T23:46:04.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:04.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673128 -lt 42949673127
2026-03-08T23:46:04.417 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:04.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542304
2026-03-08T23:46:04.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:04.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:46:04.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542304
2026-03-08T23:46:04.419 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:04.420 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542304
2026-03-08T23:46:04.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542304
2026-03-08T23:46:04.420 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542304'
2026-03-08T23:46:04.420 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:04.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542305 -lt 60129542304
2026-03-08T23:46:04.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:04.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:04.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:04.794 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:04.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:46:04.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:46:04.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:46:04.813 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:04.813 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:04.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:04.998 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:04.998 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:04.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:04.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:04.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:05.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836655
2026-03-08T23:46:05.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836655
2026-03-08T23:46:05.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836655'
2026-03-08T23:46:05.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:05.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:05.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673130
2026-03-08T23:46:05.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673130
2026-03-08T23:46:05.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836655 1-42949673130'
2026-03-08T23:46:05.182 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:05.182 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:05.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542308
2026-03-08T23:46:05.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542308
2026-03-08T23:46:05.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836655 1-42949673130 2-60129542308'
2026-03-08T23:46:05.267 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:05.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836655
2026-03-08T23:46:05.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:05.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:05.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836655
2026-03-08T23:46:05.269 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:05.270 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836655
2026-03-08T23:46:05.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836655
2026-03-08T23:46:05.270 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836655'
2026-03-08T23:46:05.270 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:05.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836652 -lt 21474836655
2026-03-08T23:46:05.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:46:06.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:46:06.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:06.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836655 -lt 21474836655
2026-03-08T23:46:06.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:06.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673130
2026-03-08T23:46:06.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:06.654 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:06.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673130
2026-03-08T23:46:06.654 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:06.655 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673130
2026-03-08T23:46:06.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673130
2026-03-08T23:46:06.655 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673130'
2026-03-08T23:46:06.655 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:06.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673131 -lt 42949673130
2026-03-08T23:46:06.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:06.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542308
2026-03-08T23:46:06.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:06.839 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:46:06.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542308
2026-03-08T23:46:06.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:06.840 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542308
2026-03-08T23:46:06.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542308
2026-03-08T23:46:06.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542308'
2026-03-08T23:46:06.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:07.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542308 -lt 60129542308
2026-03-08T23:46:07.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:07.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:07.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:07.189 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:07.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:46:07.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:46:07.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:46:07.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:07.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:07.383 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:07.383 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:07.383 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:07.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:07.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:07.384 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836658
2026-03-08T23:46:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836658
2026-03-08T23:46:07.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836658'
2026-03-08T23:46:07.487 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:07.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:07.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673133
2026-03-08T23:46:07.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673133
2026-03-08T23:46:07.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836658 1-42949673133'
2026-03-08T23:46:07.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:07.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:07.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542311
2026-03-08T23:46:07.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542311
2026-03-08T23:46:07.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836658 1-42949673133 2-60129542311'
2026-03-08T23:46:07.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:07.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836658
2026-03-08T23:46:07.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:07.690 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:07.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836658
2026-03-08T23:46:07.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:07.691 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836658
2026-03-08T23:46:07.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836658
2026-03-08T23:46:07.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836658'
2026-03-08T23:46:07.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:07.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836658 -lt 21474836658
2026-03-08T23:46:07.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:07.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673133
2026-03-08T23:46:07.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:07.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:07.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673133
2026-03-08T23:46:07.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:07.884 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673133
2026-03-08T23:46:07.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673133
2026-03-08T23:46:07.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673133'
2026-03-08T23:46:07.884 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:08.074 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673134 -lt 42949673133
2026-03-08T23:46:08.075 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:08.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542311
2026-03-08T23:46:08.075 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:08.076 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:46:08.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542311
2026-03-08T23:46:08.076 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:08.077 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542311
2026-03-08T23:46:08.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542311
2026-03-08T23:46:08.077 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542311'
2026-03-08T23:46:08.077 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:08.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542311 -lt 60129542311
2026-03-08T23:46:08.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:08.285 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:08.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:08.452 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:08.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:46:08.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:46:08.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:46:08.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:08.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:08.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:08.649 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:08.649 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:08.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:08.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:08.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:08.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836660
2026-03-08T23:46:08.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836660
2026-03-08T23:46:08.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836660'
2026-03-08T23:46:08.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:08.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:08.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673136
2026-03-08T23:46:08.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673136
2026-03-08T23:46:08.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836660 1-42949673136'
2026-03-08T23:46:08.829 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:08.829 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:08.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542313
2026-03-08T23:46:08.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542313
2026-03-08T23:46:08.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836660 1-42949673136 2-60129542313'
2026-03-08T23:46:08.920 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:08.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836660
2026-03-08T23:46:08.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:08.921 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:08.922 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836660
2026-03-08T23:46:08.922
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:08.923 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836660 2026-03-08T23:46:08.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836660 2026-03-08T23:46:08.923 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836660' 2026-03-08T23:46:08.923 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:09.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836658 -lt 21474836660 2026-03-08T23:46:09.122 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:46:10.123 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:46:10.124 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:10.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836661 -lt 21474836660 2026-03-08T23:46:10.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:10.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949673136 2026-03-08T23:46:10.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:10.305 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:46:10.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673136 2026-03-08T23:46:10.305 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:10.307 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673136 2026-03-08T23:46:10.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673136 2026-03-08T23:46:10.307 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673136' 2026-03-08T23:46:10.307 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:46:10.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673137 -lt 42949673136 2026-03-08T23:46:10.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:10.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542313 2026-03-08T23:46:10.514 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:10.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:46:10.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542313 2026-03-08T23:46:10.515 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:10.516 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542313 2026-03-08T23:46:10.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542313 2026-03-08T23:46:10.517 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542313' 2026-03-08T23:46:10.517 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:46:10.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542314 -lt 60129542313 2026-03-08T23:46:10.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:46:10.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:46:10.701 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:46:10.869 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:46:10.886 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:46:10.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:46:10.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:46:10.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:46:10.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:46:11.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:46:11.067 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:46:11.067 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:46:11.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:46:11.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:11.067 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:46:11.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836664 2026-03-08T23:46:11.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836664 
2026-03-08T23:46:11.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836664' 2026-03-08T23:46:11.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:11.153 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:46:11.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673139 2026-03-08T23:46:11.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673139 2026-03-08T23:46:11.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836664 1-42949673139' 2026-03-08T23:46:11.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:11.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:46:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542317 2026-03-08T23:46:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542317 2026-03-08T23:46:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836664 1-42949673139 2-60129542317' 2026-03-08T23:46:11.325 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:11.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836664 2026-03-08T23:46:11.326 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:11.327 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:46:11.327 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836664 2026-03-08T23:46:11.327 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:11.328 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836664 2026-03-08T23:46:11.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836664 2026-03-08T23:46:11.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836664' 2026-03-08T23:46:11.328 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:11.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836661 -lt 21474836664 2026-03-08T23:46:11.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:46:12.510 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:46:12.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:12.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836664 -lt 21474836664 2026-03-08T23:46:12.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:12.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673139 2026-03-08T23:46:12.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:12.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:46:12.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673139 2026-03-08T23:46:12.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:12.691 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673139 2026-03-08T23:46:12.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673139 2026-03-08T23:46:12.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673139' 
2026-03-08T23:46:12.691 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:46:12.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673140 -lt 42949673139 2026-03-08T23:46:12.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:12.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542317 2026-03-08T23:46:12.870 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:12.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:46:12.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542317 2026-03-08T23:46:12.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:12.873 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542317 2026-03-08T23:46:12.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542317 2026-03-08T23:46:12.873 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542317' 2026-03-08T23:46:12.873 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph 
osd last-stat-seq 2 2026-03-08T23:46:13.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542317 -lt 60129542317 2026-03-08T23:46:13.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:46:13.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:46:13.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:46:13.231 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:46:13.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:46:13.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:46:13.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:46:13.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:46:13.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:46:13.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:46:13.426 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:46:13.426 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:46:13.426 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:46:13.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:13.426 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:46:13.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836667 2026-03-08T23:46:13.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836667 2026-03-08T23:46:13.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836667' 2026-03-08T23:46:13.516 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:13.516 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:46:13.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673142 2026-03-08T23:46:13.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673142 2026-03-08T23:46:13.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836667 1-42949673142' 2026-03-08T23:46:13.600 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:13.600 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:46:13.683 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542320 2026-03-08T23:46:13.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542320 2026-03-08T23:46:13.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836667 1-42949673142 2-60129542320' 2026-03-08T23:46:13.684 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:13.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836667 2026-03-08T23:46:13.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:13.685 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:46:13.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836667 2026-03-08T23:46:13.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:13.686 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836667 
2026-03-08T23:46:13.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836667 2026-03-08T23:46:13.686 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836667' 2026-03-08T23:46:13.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:13.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836667 -lt 21474836667 2026-03-08T23:46:13.877 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:13.877 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673142 2026-03-08T23:46:13.877 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:13.878 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:46:13.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673142 2026-03-08T23:46:13.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:13.880 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673142 2026-03-08T23:46:13.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=42949673142 2026-03-08T23:46:13.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673142' 2026-03-08T23:46:13.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:46:14.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673143 -lt 42949673142 2026-03-08T23:46:14.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:14.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542320 2026-03-08T23:46:14.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:14.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:46:14.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542320 2026-03-08T23:46:14.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:14.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542320 2026-03-08T23:46:14.058 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542320 2026-03-08T23:46:14.058 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.2 seq 60129542320'
2026-03-08T23:46:14.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:14.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542320 -lt 60129542320
2026-03-08T23:46:14.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:14.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:14.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:14.388 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:14.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:46:14.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:46:14.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:46:14.402 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:14.402 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:14.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:14.571 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:14.571 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:14.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:14.571 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:14.571 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:14.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836669
2026-03-08T23:46:14.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836669
2026-03-08T23:46:14.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836669'
2026-03-08T23:46:14.662 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:14.662 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:14.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673145
2026-03-08T23:46:14.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673145
2026-03-08T23:46:14.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836669 1-42949673145'
2026-03-08T23:46:14.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:14.745 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:14.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542322
2026-03-08T23:46:14.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542322
2026-03-08T23:46:14.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836669 1-42949673145 2-60129542322'
2026-03-08T23:46:14.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:14.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836669
2026-03-08T23:46:14.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:14.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:14.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:14.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836669
2026-03-08T23:46:14.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836669
2026-03-08T23:46:14.828 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836669
2026-03-08T23:46:14.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836669'
2026-03-08T23:46:14.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:15.001 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836667 -lt 21474836669
2026-03-08T23:46:15.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:46:16.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:46:16.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:16.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836670 -lt 21474836669
2026-03-08T23:46:16.193 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:16.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673145
2026-03-08T23:46:16.193 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:16.194 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:16.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673145
2026-03-08T23:46:16.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:16.196 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673145
2026-03-08T23:46:16.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673145
2026-03-08T23:46:16.196 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673145'
2026-03-08T23:46:16.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:16.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673146 -lt 42949673145
2026-03-08T23:46:16.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:16.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542322
2026-03-08T23:46:16.379 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:16.381 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:46:16.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542322
2026-03-08T23:46:16.381 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:16.382 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542322
2026-03-08T23:46:16.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542322
2026-03-08T23:46:16.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542322'
2026-03-08T23:46:16.383 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:16.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542323 -lt 60129542322
2026-03-08T23:46:16.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:16.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:16.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:16.737 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:16.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:46:16.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:46:16.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:46:16.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:16.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:16.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:16.940 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:16.940 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:16.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:16.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:16.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:17.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836673
2026-03-08T23:46:17.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836673
2026-03-08T23:46:17.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836673'
2026-03-08T23:46:17.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:17.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673148
2026-03-08T23:46:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673148
2026-03-08T23:46:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836673 1-42949673148'
2026-03-08T23:46:17.120 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:17.120 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:17.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542326
2026-03-08T23:46:17.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542326
2026-03-08T23:46:17.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836673 1-42949673148 2-60129542326'
2026-03-08T23:46:17.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:17.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836673
2026-03-08T23:46:17.202 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:17.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:17.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836673
2026-03-08T23:46:17.203 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:17.204 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836673
2026-03-08T23:46:17.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836673
2026-03-08T23:46:17.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836673'
2026-03-08T23:46:17.204 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:17.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836670 -lt 21474836673
2026-03-08T23:46:17.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:46:18.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:46:18.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:18.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836673 -lt 21474836673
2026-03-08T23:46:18.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:18.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673148
2026-03-08T23:46:18.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:18.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:18.558 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673148
2026-03-08T23:46:18.558 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:18.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673148
2026-03-08T23:46:18.559 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673148
2026-03-08T23:46:18.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673148'
2026-03-08T23:46:18.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:18.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673149 -lt 42949673148
2026-03-08T23:46:18.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:18.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542326
2026-03-08T23:46:18.732 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:18.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:46:18.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542326
2026-03-08T23:46:18.734 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:18.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542326
2026-03-08T23:46:18.735 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542326
2026-03-08T23:46:18.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542326'
2026-03-08T23:46:18.735 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:18.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542326 -lt 60129542326
2026-03-08T23:46:18.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:18.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:18.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:19.066 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:19.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:46:19.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:46:19.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:46:19.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:19.080 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:19.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:19.251 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:19.251 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:19.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:19.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:19.251 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:19.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836676
2026-03-08T23:46:19.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836676
2026-03-08T23:46:19.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836676'
2026-03-08T23:46:19.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:19.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:19.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673151
2026-03-08T23:46:19.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673151
2026-03-08T23:46:19.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836676 1-42949673151'
2026-03-08T23:46:19.416 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:19.417 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:19.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542329
2026-03-08T23:46:19.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542329
2026-03-08T23:46:19.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836676 1-42949673151 2-60129542329'
2026-03-08T23:46:19.499 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:19.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836676
2026-03-08T23:46:19.500 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:19.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:19.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:19.501 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836676
2026-03-08T23:46:19.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836676
2026-03-08T23:46:19.502 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836676
2026-03-08T23:46:19.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836676'
2026-03-08T23:46:19.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:19.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836673 -lt 21474836676
2026-03-08T23:46:19.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:46:20.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:46:20.677 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:20.858 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836676 -lt 21474836676
2026-03-08T23:46:20.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:20.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673151
2026-03-08T23:46:20.859 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:20.860 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:20.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673151
2026-03-08T23:46:20.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:20.861 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673151
2026-03-08T23:46:20.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673151
2026-03-08T23:46:20.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673151'
2026-03-08T23:46:20.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:21.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673152 -lt 42949673151
2026-03-08T23:46:21.040 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:21.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542329
2026-03-08T23:46:21.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:21.042 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:46:21.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542329
2026-03-08T23:46:21.042 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:21.043 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542329
2026-03-08T23:46:21.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542329
2026-03-08T23:46:21.043 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542329'
2026-03-08T23:46:21.043 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:21.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542329 -lt 60129542329
2026-03-08T23:46:21.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:21.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:21.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:21.379 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:21.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue
2026-03-08T23:46:21.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200)
2026-03-08T23:46:21.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats
2026-03-08T23:46:21.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:21.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:21.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:21.565 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:21.565 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:21.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:21.565 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:21.565 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:21.643 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836679
2026-03-08T23:46:21.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836679
2026-03-08T23:46:21.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836679'
2026-03-08T23:46:21.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:21.644 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:21.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673155
2026-03-08T23:46:21.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673155
2026-03-08T23:46:21.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836679 1-42949673155'
2026-03-08T23:46:21.727 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:21.727 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:21.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542332
2026-03-08T23:46:21.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542332
2026-03-08T23:46:21.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836679 1-42949673155 2-60129542332'
2026-03-08T23:46:21.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:21.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836679
2026-03-08T23:46:21.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:21.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:21.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836679
2026-03-08T23:46:21.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:21.825 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836679
2026-03-08T23:46:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836679
2026-03-08T23:46:21.825 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836679'
2026-03-08T23:46:21.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:21.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836679 -lt 21474836679
2026-03-08T23:46:21.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:21.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673155
2026-03-08T23:46:21.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:22.000 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:22.000 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673155
2026-03-08T23:46:22.001 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:22.002 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673155
2026-03-08T23:46:22.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673155
2026-03-08T23:46:22.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673155'
2026-03-08T23:46:22.002 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:22.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673155 -lt 42949673155
2026-03-08T23:46:22.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:22.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542332
2026-03-08T23:46:22.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274:
flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:22.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:46:22.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542332 2026-03-08T23:46:22.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:22.183 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542332 2026-03-08T23:46:22.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542332 2026-03-08T23:46:22.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542332' 2026-03-08T23:46:22.183 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:46:22.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542331 -lt 60129542332 2026-03-08T23:46:22.357 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:46:23.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:46:23.358 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:46:23.542 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542331 -lt 60129542332 2026-03-08T23:46:23.543 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:46:24.544 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:46:24.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:46:24.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542334 -lt 60129542332 2026-03-08T23:46:24.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:46:24.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:46:24.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:46:24.887 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:46:24.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:46:24.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:46:24.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 
2026-03-08T23:46:24.900 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:46:24.900 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:46:25.069 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:46:25.070 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:46:25.070 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:46:25.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:46:25.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:25.070 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:46:25.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836684 2026-03-08T23:46:25.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836684 2026-03-08T23:46:25.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836684' 2026-03-08T23:46:25.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:25.151 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
ceph tell osd.1 flush_pg_stats 2026-03-08T23:46:25.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673159 2026-03-08T23:46:25.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673159 2026-03-08T23:46:25.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836684 1-42949673159' 2026-03-08T23:46:25.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:25.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:46:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542337 2026-03-08T23:46:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542337 2026-03-08T23:46:25.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836684 1-42949673159 2-60129542337' 2026-03-08T23:46:25.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:25.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836684 2026-03-08T23:46:25.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 
2026-03-08T23:46:25.315 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:46:25.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836684 2026-03-08T23:46:25.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:25.317 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836684 2026-03-08T23:46:25.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836684 2026-03-08T23:46:25.317 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836684' 2026-03-08T23:46:25.317 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:25.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836681 -lt 21474836684 2026-03-08T23:46:25.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:46:26.494 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:46:26.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:26.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
21474836684 -lt 21474836684 2026-03-08T23:46:26.663 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:26.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673159 2026-03-08T23:46:26.663 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:26.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:46:26.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673159 2026-03-08T23:46:26.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:26.666 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673159 2026-03-08T23:46:26.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673159 2026-03-08T23:46:26.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673159' 2026-03-08T23:46:26.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:46:26.841 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673160 -lt 42949673159 2026-03-08T23:46:26.841 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:26.841 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542337 2026-03-08T23:46:26.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:26.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:46:26.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542337 2026-03-08T23:46:26.844 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:26.845 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542337 2026-03-08T23:46:26.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542337 2026-03-08T23:46:26.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542337' 2026-03-08T23:46:26.845 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:46:27.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542337 -lt 60129542337 2026-03-08T23:46:27.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 
2026-03-08T23:46:27.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:46:27.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:46:27.175 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:46:27.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:46:27.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:46:27.187 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:46:27.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:46:27.188 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:46:27.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:46:27.349 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:46:27.349 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:46:27.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:46:27.349 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:27.350 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:46:27.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836687 2026-03-08T23:46:27.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836687 2026-03-08T23:46:27.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836687' 2026-03-08T23:46:27.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:27.432 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:46:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673162 2026-03-08T23:46:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673162 2026-03-08T23:46:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836687 1-42949673162' 2026-03-08T23:46:27.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:46:27.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:46:27.588 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542340 2026-03-08T23:46:27.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542340 2026-03-08T23:46:27.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836687 1-42949673162 2-60129542340' 2026-03-08T23:46:27.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:27.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836687 2026-03-08T23:46:27.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:27.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:46:27.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836687 2026-03-08T23:46:27.590 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:27.592 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836687 2026-03-08T23:46:27.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836687 2026-03-08T23:46:27.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836687' 
2026-03-08T23:46:27.592 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:27.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836684 -lt 21474836687 2026-03-08T23:46:27.762 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:46:28.763 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:46:28.763 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:46:28.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836687 -lt 21474836687 2026-03-08T23:46:28.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:28.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673162 2026-03-08T23:46:28.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:28.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:46:28.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673162 2026-03-08T23:46:28.935 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:46:28.936 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673162 2026-03-08T23:46:28.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673162 2026-03-08T23:46:28.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673162' 2026-03-08T23:46:28.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:46:29.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673163 -lt 42949673162 2026-03-08T23:46:29.106 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:46:29.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542340 2026-03-08T23:46:29.106 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:46:29.107 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:46:29.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542340 2026-03-08T23:46:29.108 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:46:29.109 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542340 2026-03-08T23:46:29.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542340 2026-03-08T23:46:29.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542340' 2026-03-08T23:46:29.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:46:29.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542340 -lt 60129542340 2026-03-08T23:46:29.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs 2026-03-08T23:46:29.280 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing 2026-03-08T23:46:29.281 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0' 2026-03-08T23:46:29.433 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:46:29.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:380: _scrub_abort: continue 2026-03-08T23:46:29.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:375: _scrub_abort: for i in $(seq 0 200) 2026-03-08T23:46:29.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:377: _scrub_abort: flush_pg_stats 2026-03-08T23:46:29.445 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500
2026-03-08T23:46:29.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:46:29.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:46:29.614 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:46:29.614 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:46:29.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:46:29.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:29.614 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:46:29.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836690
2026-03-08T23:46:29.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836690
2026-03-08T23:46:29.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836690'
2026-03-08T23:46:29.691 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:29.692 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:46:29.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673166
2026-03-08T23:46:29.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673166
2026-03-08T23:46:29.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836690 1-42949673166'
2026-03-08T23:46:29.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:46:29.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:46:29.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=60129542343
2026-03-08T23:46:29.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 60129542343
2026-03-08T23:46:29.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836690 1-42949673166 2-60129542343'
2026-03-08T23:46:29.859 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:29.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836690
2026-03-08T23:46:29.860 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:29.861 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:46:29.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836690
2026-03-08T23:46:29.861 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:29.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836690
2026-03-08T23:46:29.862 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836690
2026-03-08T23:46:29.862 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836690'
2026-03-08T23:46:29.862 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:46:30.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836690 -lt 21474836690
2026-03-08T23:46:30.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:30.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673166
2026-03-08T23:46:30.029 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:30.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:46:30.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:30.031 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673166
2026-03-08T23:46:30.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673166
2026-03-08T23:46:30.032 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673166
2026-03-08T23:46:30.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673166'
2026-03-08T23:46:30.032 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:30.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673165 -lt 42949673166
2026-03-08T23:46:30.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:46:31.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']'
2026-03-08T23:46:31.205 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:31.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673165 -lt 42949673166
2026-03-08T23:46:31.384 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:46:32.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']'
2026-03-08T23:46:32.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:46:32.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673168 -lt 42949673166
2026-03-08T23:46:32.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:46:32.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-60129542343
2026-03-08T23:46:32.557 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:46:32.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:46:32.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-60129542343
2026-03-08T23:46:32.559 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:46:32.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=60129542343
2026-03-08T23:46:32.560 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 60129542343
2026-03-08T23:46:32.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 60129542343'
2026-03-08T23:46:32.560 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:46:32.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 60129542345 -lt 60129542343
2026-03-08T23:46:32.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: ceph pg dump pgs
2026-03-08T23:46:32.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep -q scrubbing
2026-03-08T23:46:32.737 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:378: _scrub_abort: grep '^1.0'
2026-03-08T23:46:32.896 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs
2026-03-08T23:46:32.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:383: _scrub_abort: break
2026-03-08T23:46:32.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:385: _scrub_abort: set +o pipefail
2026-03-08T23:46:32.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:387: _scrub_abort: sleep 5
2026-03-08T23:46:37.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:389: _scrub_abort: grep 'noscrub set, aborting' td/osd-scrub-test/osd.1.log
2026-03-08T23:46:37.920 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:44:35.451+0000 7f81f3457640 10 osd.1 pg_epoch: 19 pg[1.0( v 18'1000 (0'0,18'1000] local-lis/les=15/17 n=1000 ec=15/15 lis/c=15/15 les/c/f=17/17/0 sis=15) [1,0,2] r=0 lpr=15 crt=18'1000 lcod 18'999 mlcod 18'999 active+clean+scrubbing [ 1.0: ] ] scrubber: noscrub set, aborting
2026-03-08T23:46:37.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:395: _scrub_abort: get_last_scrub_stamp 1.0
2026-03-08T23:46:37.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:37.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:37.920 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:37.921 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:38.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:395: _scrub_abort: local last_scrub=2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:38.086 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:396: _scrub_abort: ceph config set osd osd_scrub_sleep 0.1
2026-03-08T23:46:38.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:398: _scrub_abort: ceph osd unset noscrub
2026-03-08T23:46:38.452 INFO:tasks.workunit.client.0.vm03.stderr:noscrub is unset
2026-03-08T23:46:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:399: _scrub_abort: '[' scrub = deep-scrub ']'
2026-03-08T23:46:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:403: _scrub_abort: TIMEOUT=500
2026-03-08T23:46:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:404: _scrub_abort: wait_for_scrub 1.0 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0
2026-03-08T23:46:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_scrub_stamp
2026-03-08T23:46:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 ))
2026-03-08T23:46:38.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:38.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:38.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:38.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:38.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
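The wait_for_scrub iterations traced below keep sleeping because `test ... '>' ...` compares the two scrub stamps as strings and they are still equal. A minimal standalone sketch of that comparison (sample timestamps are illustrative, not taken from this run's cluster state; the real helper lives in qa/standalone/ceph-helpers.sh):

```shell
# Lexical comparison with `test '>'` is valid here because ISO-8601
# timestamps sort the same way lexically and chronologically.
old='2026-03-01T23:42:49.952081+0000'
new='2026-03-08T23:44:35.451000+0000'
if test "$new" '>' "$old"; then echo "scrub advanced"; fi
if ! test "$old" '>' "$old"; then echo "equal stamp: still waiting"; fi
```

This is why the loop below only exits once `last_scrub_stamp` reported by `ceph pg dump pgs` moves past the stamp recorded before the scrub was requested.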
2026-03-08T23:46:38.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:38.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:38.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:39.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:39.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:39.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:39.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:39.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:39.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:39.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:39.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:39.822 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:40.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:40.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:40.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:40.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:40.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:40.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:40.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:40.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:40.998 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:41.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:41.999 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:41.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:41.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:41.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:41.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:41.999 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:42.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:42.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:43.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:43.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:43.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:43.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:43.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:43.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:43.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:43.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:43.342 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:44.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:44.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:44.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:44.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:44.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:44.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:44.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:44.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:44.526 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:45.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:45.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:45.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:45.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:45.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:45.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:45.527 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:45.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:45.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:46.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:46.689 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:46.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:46.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:46.689 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:46.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:46.690 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:46.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:46.855 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:47.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:47.856 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:47.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:47.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:47.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:47.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:47.857 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:48.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:48.031 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:49.032 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:49.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:49.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:49.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:49.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:49.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:49.033 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:49.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:49.204 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:50.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:50.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:50.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:50.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:50.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:50.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:50.206 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:50.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:50.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:51.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:51.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:51.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:51.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:51.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:51.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:51.377 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:51.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:51.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:52.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:52.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:52.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:52.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:52.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:52.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:52.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:52.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:52.728 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:53.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:53.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:53.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:53.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:53.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:53.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:53.730 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:53.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:53.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:54.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:54.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:54.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:54.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:54.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:46:54.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:46:54.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:46:55.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000
2026-03-08T23:46:55.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
2026-03-08T23:46:56.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ ))
2026-03-08T23:46:56.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 ))
2026-03-08T23:46:56.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp
2026-03-08T23:46:56.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:46:56.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527:
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:46:56.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:46:56.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:46:56.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:46:56.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:46:57.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:46:57.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:46:57.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:46:57.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:46:57.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:46:57.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:46:57.253 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:46:57.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:46:57.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:46:58.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:46:58.424 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:46:58.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:46:58.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:46:58.424 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:46:58.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:46:58.425 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:46:58.601 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:46:58.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:46:59.602 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:46:59.605 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:46:59.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:46:59.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:46:59.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:46:59.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:46:59.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:46:59.769 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:46:59.769 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:00.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:00.770 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:00.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:00.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:00.770 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:00.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:00.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:00.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:00.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:01.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:47:01.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:01.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:01.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:01.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:01.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:01.939 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:02.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:02.108 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:03.109 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:03.109 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:03.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:03.109 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:03.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:03.110 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:03.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:03.286 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:04.287 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:04.288 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:04.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:04.288 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:04.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:04.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:04.288 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:04.465 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:04.466 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:05.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:05.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:05.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:05.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:05.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:05.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:05.467 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:05.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:05.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:06.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:06.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:06.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:06.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:06.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:06.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:06.648 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:06.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:06.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:07.827 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:07.828 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:07.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:07.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:07.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:07.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:07.828 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:08.002 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:08.002 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:09.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:09.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:09.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:09.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:09.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:09.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:09.004 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:09.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:09.170 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:10.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:10.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:10.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:10.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:10.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:10.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:10.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:10.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:10.340 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:11.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( 
i++ )) 2026-03-08T23:47:11.341 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:11.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:11.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:11.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:11.341 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:11.342 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:11.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:11.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:12.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:12.511 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:12.511 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:12.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:12.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:12.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:12.511 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:12.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:12.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:13.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:13.681 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:13.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:13.682 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:13.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:13.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:13.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:13.852 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:13.853 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:47:14.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:47:14.854 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:14.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:14.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:14.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: 
get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:14.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:14.854 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:15.029 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-01T23:42:49.952081+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:15.030 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1
[... identical wait_for_scrub/get_last_scrub_stamp polling iterations (ceph-helpers.sh lines 2076-2080 and 1526-1529), once per second with an unchanged last_scrub_stamp of 2026-03-01T23:42:49.952081+0000, elided through 2026-03-08T23:47:39 ...]
2026-03-08T23:47:39.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: ((
i++ )) 2026-03-08T23:47:39.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:47:39.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:47:39.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:47:39.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:47:39.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:47:39.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:47:39.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:47:38.153735+0000 '>' 2026-03-01T23:42:49.952081+0000 2026-03-08T23:47:39.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:47:39.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:405: _scrub_abort: perf_counters td/osd-scrub-test 3 2026-03-08T23:47:39.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:43: perf_counters: local dir=td/osd-scrub-test 2026-03-08T23:47:39.779 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:44: perf_counters: local OSDS=3 2026-03-08T23:47:39.779 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: expr 3 - 1 2026-03-08T23:47:39.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: seq 0 2 2026-03-08T23:47:39.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:47:39.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.0 counter dump 2026-03-08T23:47:39.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 
2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.862 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: 
"reservation_process_failure": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: 
"avgtime": 0 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:39.863 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 
2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:39.864 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: 
"replicas_in_reservation": 0 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.865 
INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:39.865 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stdout:} 
2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:47:39.874 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.1 counter dump 2026-03-08T23:47:39.875 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 
2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:39.944 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 
2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.945 
INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.945 
INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:47:39.945 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 
0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:39.946 
INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 2, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 2, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 1, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 1, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 55.064321935, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 55.064321935 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 1, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 1, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 5.004024467, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 5.004024467 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 11, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: 
"write_blocked_by_scrub": 0, 2026-03-08T23:47:39.946 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 2, 2026-03-08T23:47:39.955 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:39.955 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 2, 2026-03-08T23:47:39.955 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0.008000043, 2026-03-08T23:47:39.955 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0.004000021 2026-03-08T23:47:39.955 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.955 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 2 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.2 
counter dump 2026-03-08T23:47:39.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:40.028 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.029 
INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: ], 
2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:40.029 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 
2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: 
"level": "shallow", 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:47:40.030 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:40.031 
INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:47:40.031 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:47:40.032 
INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:47:40.032 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.040 
INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:47:40.040 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_scrub_abort ------------------ 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_scrub_abort ------------------' 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:47:40.041 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:47:40.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:47:40.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:47:40.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:47:40.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:47:40.167 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:47:40.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:47:40.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:47:40.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:47:40.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:47:40.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:47:40.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:47:40.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:47:40.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:47:40.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:47:40.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:47:40.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:47:40.227 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:47:40.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:47:40.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_scrub_abort ------------------ 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_scrub_extended_sleep ------------------- 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_scrub_abort ------------------' 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_scrub_extended_sleep -------------------' 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test 2026-03-08T23:47:40.228 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:47:40.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:47:40.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:47:40.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 
2026-03-08T23:47:40.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:47:40.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:47:40.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:47:40.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:47:40.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:47:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:47:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:47:40.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:47:40.233 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:47:40.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:47:40.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:47:40.235 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:47:40.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:47:40.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:47:40.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:47:40.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:47:40.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:47:40.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:47:40.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test 2026-03-08T23:47:40.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:47:40.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:47:40.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:47:40.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827 
2026-03-08T23:47:40.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_scrub_extended_sleep ----------------------- 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_scrub_extended_sleep -----------------------' 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_scrub_extended_sleep td/osd-scrub-test 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:209: TEST_scrub_extended_sleep: local dir=td/osd-scrub-test 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:210: TEST_scrub_extended_sleep: local poolname=test 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:211: TEST_scrub_extended_sleep: local OSDS=3 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:212: TEST_scrub_extended_sleep: local objects=15 
2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:214: TEST_scrub_extended_sleep: TESTDATA=testdata.475827 2026-03-08T23:47:40.240 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:216: TEST_scrub_extended_sleep: date +%w 2026-03-08T23:47:40.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:216: TEST_scrub_extended_sleep: DAY=0 2026-03-08T23:47:40.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:218: TEST_scrub_extended_sleep: '[' 0 -ge 4 ']' 2026-03-08T23:47:40.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:223: TEST_scrub_extended_sleep: expr 0 + 2 2026-03-08T23:47:40.243 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:223: TEST_scrub_extended_sleep: DAY_START=2 2026-03-08T23:47:40.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:224: TEST_scrub_extended_sleep: expr 0 + 3 2026-03-08T23:47:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:224: TEST_scrub_extended_sleep: DAY_END=3 2026-03-08T23:47:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:226: TEST_scrub_extended_sleep: run_mon td/osd-scrub-test a --osd_pool_default_size=3 2026-03-08T23:47:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-test 2026-03-08T23:47:40.244 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:47:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:47:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:47:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-test/a 2026-03-08T23:47:40.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-test/a --run-dir=td/osd-scrub-test --osd_pool_default_size=3 2026-03-08T23:47:40.267 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:47:40.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:47:40.268 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:47:40.268 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:47:40.268 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:47:40.268 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:47:40.268 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:47:40.268 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-test/a '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-test/log --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=3 2026-03-08T23:47:40.292 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:47:40.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:47:40.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:47:40.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:47:40.292 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:47:40.293 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 
2026-03-08T23:47:40.293 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:47:40.293 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:47:40.293 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:47:40.294 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:47:40.294 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:47:40.294 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:47:40.294 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:47:40.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:47:40.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get fsid 2026-03-08T23:47:40.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:47:40.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:47:40.356 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:47:40.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:47:40.356 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:47:40.356 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:47:40.356 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:47:40.356 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:47:40.357 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:47:40.357 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:47:40.357 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:47:40.357 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:47:40.360 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:47:40.360 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get mon_host 2026-03-08T23:47:40.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:227: TEST_scrub_extended_sleep: run_mgr td/osd-scrub-test x --mgr_stats_period=1 2026-03-08T23:47:40.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-test 2026-03-08T23:47:40.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:47:40.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:47:40.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:47:40.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-test/x 2026-03-08T23:47:40.433 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:47:40.543 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:47:40.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:47:40.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:47:40.544 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:47:40.544 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:47:40.544 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:47:40.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:47:40.544 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:47:40.545 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-test/x '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mgr_stats_period=1 2026-03-08T23:47:40.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:229: TEST_scrub_extended_sleep: local 'ceph_osd_args=--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 ' 2026-03-08T23:47:40.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:230: TEST_scrub_extended_sleep: ceph_osd_args+='--osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 ' 
2026-03-08T23:47:40.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:231: TEST_scrub_extended_sleep: ceph_osd_args+='--osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 ' 2026-03-08T23:47:40.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:232: TEST_scrub_extended_sleep: ceph_osd_args+='--osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 ' 2026-03-08T23:47:40.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:233: TEST_scrub_extended_sleep: ceph_osd_args+='--osd_op_queue=wpq --osd_scrub_end_week_day=3 ' 2026-03-08T23:47:40.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:234: TEST_scrub_extended_sleep: ceph_osd_args+=--bluestore_cache_autotune=false 2026-03-08T23:47:40.573 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:236: TEST_scrub_extended_sleep: expr 3 - 1 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:236: TEST_scrub_extended_sleep: seq 0 2 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:236: TEST_scrub_extended_sleep: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:238: TEST_scrub_extended_sleep: run_osd td/osd-scrub-test 0 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 
--osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/0 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/0' 2026-03-08T23:47:40.575 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/0/journal' 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:47:40.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:47:40.576 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:47:40.576 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false' 2026-03-08T23:47:40.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/0 2026-03-08T23:47:40.577 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:47:40.578 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 ddd0386e-a24f-449e-a370-31a48d532508 2026-03-08T23:47:40.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=ddd0386e-a24f-449e-a370-31a48d532508 2026-03-08T23:47:40.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 ddd0386e-a24f-449e-a370-31a48d532508' 2026-03-08T23:47:40.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:47:40.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCcCq5pLvvMIxAAoo2Jow043GBquXlVHq8m4A== 2026-03-08T23:47:40.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo 
'{"cephx_secret": "AQCcCq5pLvvMIxAAoo2Jow043GBquXlVHq8m4A=="}' 2026-03-08T23:47:40.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new ddd0386e-a24f-449e-a370-31a48d532508 -i td/osd-scrub-test/0/new.json 2026-03-08T23:47:40.702 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:47:40.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/0/new.json 2026-03-08T23:47:40.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false --mkfs --key AQCcCq5pLvvMIxAAoo2Jow043GBquXlVHq8m4A== --osd-uuid ddd0386e-a24f-449e-a370-31a48d532508 2026-03-08T23:47:40.738 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:40.744+0000 7fa8cb41b8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:47:40.743 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:40.748+0000 7fa8cb41b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:47:40.744 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:40.748+0000 7fa8cb41b8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:47:40.744 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:40.748+0000 7fa8cb41b8c0 -1 bdev(0x56133a51ac00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:47:40.745 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:40.748+0000 7fa8cb41b8c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid 2026-03-08T23:47:43.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/0/keyring 2026-03-08T23:47:43.361 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:47:43.362 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:47:43.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:47:43.362 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:47:43.583 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:47:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:47:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 
--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false 2026-03-08T23:47:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:47:43.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:47:43.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:47:43.601 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:43.604+0000 7f8b13abd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:47:43.609 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:43.616+0000 7f8b13abd8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:47:43.610 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:43.616+0000 7f8b13abd8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:47:43.771 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:47:43.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:47:43.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:47:43.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:47:43.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:47:43.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:47:43.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:47:43.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:47:43.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:47:43.789 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:47:43.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:47:44.814 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:44.820+0000 7f8b13abd8c0 -1 Falling back to public interface 2026-03-08T23:47:44.956 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:47:44.956 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:47:44.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:47:44.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:47:44.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:47:44.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:47:45.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:47:45.787 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:45.792+0000 7f8b13abd8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:47:46.136 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:47:46.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:47:46.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:47:46.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:47:46.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:47:46.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:47:46.312 
INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up   in  weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/3774982002,v1:127.0.0.1:6803/3774982002] [v2:127.0.0.1:6804/3774982002,v1:127.0.0.1:6805/3774982002] exists,up ddd0386e-a24f-449e-a370-31a48d532508
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:236: TEST_scrub_extended_sleep: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:238: TEST_scrub_extended_sleep: run_osd td/osd-scrub-test 1 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/1
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 '
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/1'
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/1/journal'
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test'
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:47:46.313 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false'
2026-03-08T23:47:46.314 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/1
2026-03-08T23:47:46.315 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:47:46.316 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 17a8e4b9-a99f-42ee-a12a-268a683b416b
2026-03-08T23:47:46.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=17a8e4b9-a99f-42ee-a12a-268a683b416b
2026-03-08T23:47:46.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 17a8e4b9-a99f-42ee-a12a-268a683b416b'
2026-03-08T23:47:46.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:47:46.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCiCq5pDJcsFBAAIqTRFC+Enm8FcA4x2O8kLw==
2026-03-08T23:47:46.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCiCq5pDJcsFBAAIqTRFC+Enm8FcA4x2O8kLw=="}'
2026-03-08T23:47:46.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 17a8e4b9-a99f-42ee-a12a-268a683b416b -i td/osd-scrub-test/1/new.json
2026-03-08T23:47:46.493 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:47:46.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/1/new.json
2026-03-08T23:47:46.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false --mkfs --key AQCiCq5pDJcsFBAAIqTRFC+Enm8FcA4x2O8kLw== --osd-uuid 17a8e4b9-a99f-42ee-a12a-268a683b416b
2026-03-08T23:47:46.527 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:46.532+0000 7f961e63a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:46.529 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:46.536+0000 7f961e63a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:46.530 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:46.536+0000 7f961e63a8c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:46.530 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:46.536+0000 7f961e63a8c0 -1 bdev(0x55cd478e7c00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted
2026-03-08T23:47:46.530 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:46.536+0000 7f961e63a8c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid
2026-03-08T23:47:49.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/1/keyring
2026-03-08T23:47:49.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:47:49.008 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository
2026-03-08T23:47:49.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository
2026-03-08T23:47:49.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:47:49.225 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1
2026-03-08T23:47:49.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1
2026-03-08T23:47:49.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false
2026-03-08T23:47:49.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:47:49.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:47:49.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:47:49.243 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:49.248+0000 7fe2425c68c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:49.244 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:49.248+0000 7fe2425c68c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:49.245 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:49.248+0000 7fe2425c68c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:49.414 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:47:49.415 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:47:49.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:47:49.962 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:49.968+0000 7fe2425c68c0 -1 Falling back to public interface
2026-03-08T23:47:50.595 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:47:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:47:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:47:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:47:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:47:50.596 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:47:50.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:47:51.097 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:51.104+0000 7fe2425c68c0 -1 osd.1 0 log_to_monitors true
2026-03-08T23:47:51.780 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:47:51.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:47:51.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:47:51.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:47:51.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:47:51.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up'
2026-03-08T23:47:51.952 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up   in  weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/1117558623,v1:127.0.0.1:6811/1117558623] [v2:127.0.0.1:6812/1117558623,v1:127.0.0.1:6813/1117558623] exists,up 17a8e4b9-a99f-42ee-a12a-268a683b416b
2026-03-08T23:47:51.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:47:51.952 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:236: TEST_scrub_extended_sleep: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:238: TEST_scrub_extended_sleep: run_osd td/osd-scrub-test 2 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/2
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 '
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/2'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/2/journal'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir='
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+=
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20'
2026-03-08T23:47:51.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log'
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid'
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' '
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false'
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/2
2026-03-08T23:47:51.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen
2026-03-08T23:47:51.955 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 b7192d63-f83e-4bb5-8912-95b908d968d4
2026-03-08T23:47:51.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=b7192d63-f83e-4bb5-8912-95b908d968d4
2026-03-08T23:47:51.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 b7192d63-f83e-4bb5-8912-95b908d968d4'
2026-03-08T23:47:51.955 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key
2026-03-08T23:47:51.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQCnCq5p5NYvOhAAAb9fi2m7Kz93rcoOLxk81w==
2026-03-08T23:47:51.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQCnCq5p5NYvOhAAAb9fi2m7Kz93rcoOLxk81w=="}'
2026-03-08T23:47:51.966 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new b7192d63-f83e-4bb5-8912-95b908d968d4 -i td/osd-scrub-test/2/new.json
2026-03-08T23:47:52.142 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:47:52.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/2/new.json
2026-03-08T23:47:52.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false --mkfs --key AQCnCq5p5NYvOhAAAb9fi2m7Kz93rcoOLxk81w== --osd-uuid b7192d63-f83e-4bb5-8912-95b908d968d4
2026-03-08T23:47:52.178 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:52.184+0000 7f0cbf8898c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:52.180 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:52.184+0000 7f0cbf8898c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:52.181 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:52.188+0000 7f0cbf8898c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:52.181 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:52.188+0000 7f0cbf8898c0 -1 bdev(0x5567ba2e7c00 td/osd-scrub-test/2/block) open stat got: (1) Operation not permitted
2026-03-08T23:47:52.181 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:52.188+0000 7f0cbf8898c0 -1 bluestore(td/osd-scrub-test/2) _read_fsid unparsable uuid
2026-03-08T23:47:54.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/2/keyring
2026-03-08T23:47:54.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat
2026-03-08T23:47:54.440 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository
2026-03-08T23:47:54.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository
2026-03-08T23:47:54.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd'
2026-03-08T23:47:54.645 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2
2026-03-08T23:47:54.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2
2026-03-08T23:47:54.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --osd_scrub_sleep=0 --osd_scrub_extended_sleep=20 --osd_scrub_begin_week_day=2 --osd_op_queue=wpq --osd_scrub_end_week_day=3 --bluestore_cache_autotune=false
2026-03-08T23:47:54.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"'
2026-03-08T23:47:54.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]'
2026-03-08T23:47:54.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json
2026-03-08T23:47:54.663 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:54.668+0000 7ffb42f598c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:54.670 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:54.676+0000 7ffb42f598c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:54.671 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:54.676+0000 7ffb42f598c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stdout:0
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:47:54.833 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:47:55.006 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:47:56.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:47:56.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:47:56.007 INFO:tasks.workunit.client.0.vm03.stdout:1
2026-03-08T23:47:56.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:47:56.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:47:56.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:47:56.126 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:56.132+0000 7ffb42f598c0 -1 Falling back to public interface
2026-03-08T23:47:56.183 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:47:57.098 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:47:57.104+0000 7ffb42f598c0 -1 osd.2 0 log_to_monitors true
2026-03-08T23:47:57.184 INFO:tasks.workunit.client.0.vm03.stdout:2
2026-03-08T23:47:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:47:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:47:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:47:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:47:57.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:47:57.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:47:58.360 INFO:tasks.workunit.client.0.vm03.stdout:3
2026-03-08T23:47:58.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:47:58.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 ))
2026-03-08T23:47:58.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:47:58.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:47:58.360 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:47:58.531 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up   in  weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/2300563966,v1:127.0.0.1:6819/2300563966] [v2:127.0.0.1:6820/2300563966,v1:127.0.0.1:6821/2300563966] exists,up b7192d63-f83e-4bb5-8912-95b908d968d4
2026-03-08T23:47:58.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:47:58.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:47:58.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:47:58.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:242: TEST_scrub_extended_sleep: create_pool test 1 1
2026-03-08T23:47:58.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T23:47:58.746 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:47:58.767 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:47:59.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:243: TEST_scrub_extended_sleep: wait_for_clean 2026-03-08T23:47:59.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:47:59.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:47:59.768 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:47:59.768 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:47:59.768 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:47:59.768 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:47:59.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:47:59.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: 
true 2026-03-08T23:47:59.769 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:47:59.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:47:59.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:47:59.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:47:59.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:47:59.831 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:47:59.831 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:48:00.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:48:00.009 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:48:00.009 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:48:00.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:48:00.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:48:00.009 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:48:00.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836495 2026-03-08T23:48:00.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836495 2026-03-08T23:48:00.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495' 2026-03-08T23:48:00.088 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:48:00.088 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:48:00.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672969 2026-03-08T23:48:00.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672969 2026-03-08T23:48:00.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672969' 2026-03-08T23:48:00.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:48:00.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:48:00.245 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509443 2026-03-08T23:48:00.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509443 2026-03-08T23:48:00.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836495 1-42949672969 2-64424509443' 2026-03-08T23:48:00.245 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:48:00.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836495 2026-03-08T23:48:00.245 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:48:00.246 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:48:00.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836495 2026-03-08T23:48:00.247 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:48:00.248 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836495 2026-03-08T23:48:00.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836495 2026-03-08T23:48:00.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836495' 
2026-03-08T23:48:00.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:48:00.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836493 -lt 21474836495 2026-03-08T23:48:00.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:48:01.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:48:01.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:48:01.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836495 2026-03-08T23:48:01.604 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:48:01.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672969 2026-03-08T23:48:01.605 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:48:01.606 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:48:01.606 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672969 2026-03-08T23:48:01.606 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:48:01.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672969 2026-03-08T23:48:01.607 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672969 2026-03-08T23:48:01.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672969' 2026-03-08T23:48:01.607 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:48:01.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672971 -lt 42949672969 2026-03-08T23:48:01.781 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:48:01.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509443 2026-03-08T23:48:01.781 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:48:01.782 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:48:01.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509443 2026-03-08T23:48:01.782 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:48:01.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509443 2026-03-08T23:48:01.783 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509443' 2026-03-08T23:48:01.783 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509443 2026-03-08T23:48:01.783 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:48:01.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509445 -lt 64424509443 2026-03-08T23:48:01.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:48:01.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:48:01.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:48:02.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:48:02.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:48:02.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:48:02.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: 
local expression 2026-03-08T23:48:02.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:48:02.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:48:02.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:48:02.166 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:48:02.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:48:02.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:48:02.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:48:02.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:48:02.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:48:02.532 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:48:02.532 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:48:02.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:246: TEST_scrub_extended_sleep: get_pg test SOMETHING 2026-03-08T23:48:02.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=test 2026-03-08T23:48:02.532 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:48:02.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map test SOMETHING 2026-03-08T23:48:02.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:48:02.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:246: TEST_scrub_extended_sleep: local pgid=1.0 2026-03-08T23:48:02.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:247: TEST_scrub_extended_sleep: get_primary test SOMETHING 2026-03-08T23:48:02.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:48:02.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T23:48:02.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test SOMETHING 2026-03-08T23:48:02.719 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:48:02.898 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:247: TEST_scrub_extended_sleep: local primary=1 2026-03-08T23:48:02.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:248: TEST_scrub_extended_sleep: get_last_scrub_stamp 1.0 2026-03-08T23:48:02.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:02.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:02.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:02.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:03.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:248: TEST_scrub_extended_sleep: local last_scrub=2026-03-08T23:47:58.756100+0000 2026-03-08T23:48:03.080 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:249: TEST_scrub_extended_sleep: ceph tell 1.0 schedule-scrub 2026-03-08T23:48:03.156 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:48:03.156 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:48:03.156 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:48:03.156 
INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-01T23:46:23.167925+0000" 2026-03-08T23:48:03.156 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:48:03.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:252: TEST_scrub_extended_sleep: PASSED=false 2026-03-08T23:48:03.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:253: TEST_scrub_extended_sleep: (( i=0 )) 2026-03-08T23:48:03.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:253: TEST_scrub_extended_sleep: (( i < 15 )) 2026-03-08T23:48:03.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:254: TEST_scrub_extended_sleep: grep -q 'scrub state.*, sleeping' td/osd-scrub-test/osd.1.log 2026-03-08T23:48:03.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:259: TEST_scrub_extended_sleep: sleep 1 2026-03-08T23:48:04.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:253: TEST_scrub_extended_sleep: (( i++ )) 2026-03-08T23:48:04.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:253: TEST_scrub_extended_sleep: (( i < 15 )) 2026-03-08T23:48:04.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:254: TEST_scrub_extended_sleep: grep -q 'scrub state.*, sleeping' td/osd-scrub-test/osd.1.log 2026-03-08T23:48:04.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:256: TEST_scrub_extended_sleep: PASSED=true 2026-03-08T23:48:04.173 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:257: TEST_scrub_extended_sleep: break 2026-03-08T23:48:04.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:263: TEST_scrub_extended_sleep: '[' true = false ']' 2026-03-08T23:48:04.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:269: TEST_scrub_extended_sleep: ceph tell osd.1 config set osd_scrub_begin_week_day 0 2026-03-08T23:48:04.244 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:48:04.244 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_begin_week_day = '' (not observed, change may require restart) " 2026-03-08T23:48:04.245 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:48:04.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:270: TEST_scrub_extended_sleep: ceph tell osd.1 config set osd_scrub_end_week_day 0 2026-03-08T23:48:04.328 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:48:04.328 INFO:tasks.workunit.client.0.vm03.stdout: "success": "osd_scrub_end_week_day = '' (not observed, change may require restart) " 2026-03-08T23:48:04.328 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:48:04.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:274: TEST_scrub_extended_sleep: count=0 2026-03-08T23:48:04.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:275: TEST_scrub_extended_sleep: PASSED=false 2026-03-08T23:48:04.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:276: TEST_scrub_extended_sleep: (( i=0 )) 2026-03-08T23:48:04.338 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:276: TEST_scrub_extended_sleep: (( i < 25 )) 2026-03-08T23:48:04.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:277: TEST_scrub_extended_sleep: expr 0 + 1 2026-03-08T23:48:04.339 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:277: TEST_scrub_extended_sleep: count=1 2026-03-08T23:48:04.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:278: TEST_scrub_extended_sleep: get_last_scrub_stamp 1.0 2026-03-08T23:48:04.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:04.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:04.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:04.340 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:04.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:278: TEST_scrub_extended_sleep: test 2026-03-01T23:46:23.167925+0000 '>' 2026-03-08T23:47:58.756100+0000 2026-03-08T23:48:04.506 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:287: TEST_scrub_extended_sleep: sleep 1 2026-03-08T23:48:05.507 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:276: TEST_scrub_extended_sleep: (( i++ )) 2026-03-08T23:48:05.507 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:276: TEST_scrub_extended_sleep: (( i < 25 )) 2026-03-08T23:48:05.507 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:277: TEST_scrub_extended_sleep: expr 1 + 1 2026-03-08T23:48:05.508 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:277: TEST_scrub_extended_sleep: count=2 2026-03-08T23:48:05.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:278: TEST_scrub_extended_sleep: get_last_scrub_stamp 1.0 2026-03-08T23:48:05.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:05.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:05.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:05.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:05.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:278: TEST_scrub_extended_sleep: test 2026-03-01T23:46:23.167925+0000 '>' 2026-03-08T23:47:58.756100+0000 2026-03-08T23:48:05.679 
[... 2026-03-08T23:48:05.679 through 2026-03-08T23:48:24.324: poll iterations count=3 through count=17 elided; each sleeps 1 s, re-runs get_last_scrub_stamp 1.0 (the stamp stays 2026-03-01T23:46:23.167925+0000), then increments the counter, ending with count=18 ...]
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:278: TEST_scrub_extended_sleep: get_last_scrub_stamp 1.0
2026-03-08T23:48:24.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:48:24.324 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:48:24.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:48:24.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_scrub_extended_sleep ------------------
2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:278: TEST_scrub_extended_sleep: test 2026-03-08T23:48:23.214774+0000 '>' 2026-03-08T23:47:58.756100+0000
2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:280: TEST_scrub_extended_sleep: '[' 18 -lt 10 ']'
2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:284: TEST_scrub_extended_sleep: PASSED=true
2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:285: TEST_scrub_extended_sleep: break
2026-03-08T23:48:24.502
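[editor's note] The wait loop traced above succeeds as soon as `test "$last" '>' "$prev"` goes true; this works because ISO-8601 UTC timestamps in a fixed format compare correctly as plain strings (lexicographically). A minimal illustrative sketch, not part of the log, using the stamp values that appear in the trace:

```shell
#!/bin/sh
# Lexicographic comparison of ISO-8601 stamps, as on line 278 of
# osd-scrub-test.sh in the trace: test "$last" '>' "$prev".
prev='2026-03-08T23:47:58.756100+0000'  # stamp recorded before the scrub was requested
last='2026-03-01T23:46:23.167925+0000'  # stamp the poll keeps reading back
if test "$last" '>' "$prev"; then
    echo 'scrub stamp advanced'
else
    echo 'not yet'   # what the loop sees for count=2..17 above
fi
```

Once the stamp read back is 2026-03-08T23:48:23.214774+0000 (as in the final iteration), the same comparison goes true and the loop breaks.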
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:291: TEST_scrub_extended_sleep: '[' true = false ']' 2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_scrub_extended_sleep ------------------' 2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test 2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:48:24.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:48:24.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:48:24.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:48:24.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:48:24.623 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:48:24.623 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:48:24.624 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:48:24.624 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:48:24.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:48:24.625 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:48:24.625 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:48:24.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:48:24.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:48:24.626 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:48:24.627 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:48:24.627 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:48:24.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:48:24.628 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:48:24.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:48:24.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:24.635 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:24.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_scrub_extended_sleep ------------------ 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_scrub_permit_time ------------------- 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_scrub_extended_sleep 
------------------' 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_scrub_permit_time -------------------' 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:48:24.636 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:48:24.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:48:24.637 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:48:24.637 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:48:24.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:48:24.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:48:24.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:48:24.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:48:24.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:48:24.641 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:48:24.641 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:48:24.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:48:24.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:48:24.642 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q 
'^core\|core$' 2026-03-08T23:48:24.642 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:48:24.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:48:24.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:48:24.644 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test 2026-03-08T23:48:24.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:48:24.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:24.645 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:24.645 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:48:24.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:48:24.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:48:24.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test 2026-03-08T23:48:24.647 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:48:24.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:24.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:24.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827 2026-03-08T23:48:24.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_scrub_permit_time ----------------------- 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']' 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_scrub_permit_time -----------------------' 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_scrub_permit_time td/osd-scrub-test 2026-03-08T23:48:24.649 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:419: TEST_scrub_permit_time: local dir=td/osd-scrub-test 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:420: TEST_scrub_permit_time: local poolname=test 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:421: TEST_scrub_permit_time: local OSDS=3 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:422: TEST_scrub_permit_time: local objects=15 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:424: TEST_scrub_permit_time: TESTDATA=testdata.475827 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:426: TEST_scrub_permit_time: run_mon td/osd-scrub-test a --osd_pool_default_size=3 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-test 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:48:24.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-test/a 2026-03-08T23:48:24.649 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-test/a --run-dir=td/osd-scrub-test --osd_pool_default_size=3 2026-03-08T23:48:24.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:48:24.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:48:24.675 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:48:24.676 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:48:24.676 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:24.676 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:24.676 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:48:24.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-test/a '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-test/log 
--run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=3 2026-03-08T23:48:24.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:48:24.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:48:24.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:48:24.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:48:24.708 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:48:24.709 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:48:24.713 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:48:24.713 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:48:24.713 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:48:24.713 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:48:24.713 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:24.714 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:24.714 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:48:24.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:48:24.715 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get fsid 2026-03-08T23:48:24.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:48:24.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:48:24.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:48:24.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:48:24.795 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:48:24.795 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:48:24.796 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:48:24.796 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:48:24.796 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:48:24.796 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:24.796 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:24.796 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok 2026-03-08T23:48:24.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:48:24.804 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get mon_host 2026-03-08T23:48:24.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:427: TEST_scrub_permit_time: run_mgr td/osd-scrub-test x --mgr_stats_period=1 2026-03-08T23:48:24.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-test 2026-03-08T23:48:24.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:48:24.868 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:48:24.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:48:24.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-test/x 2026-03-08T23:48:24.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:48:24.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:48:24.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:48:24.984 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:48:24.984 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:48:24.984 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:24.984 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:24.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:48:24.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: 
run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:48:24.986 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-test/x '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mgr_stats_period=1 2026-03-08T23:48:25.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:428: TEST_scrub_permit_time: sed 's/^0//' 2026-03-08T23:48:25.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:428: TEST_scrub_permit_time: date -d '2 hour ago' +%H 2026-03-08T23:48:25.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:428: TEST_scrub_permit_time: local scrub_begin_hour=21 2026-03-08T23:48:25.014 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:429: TEST_scrub_permit_time: date -d '1 hour ago' +%H 2026-03-08T23:48:25.015 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:429: TEST_scrub_permit_time: sed 's/^0//' 2026-03-08T23:48:25.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:429: TEST_scrub_permit_time: local scrub_end_hour=22 2026-03-08T23:48:25.016 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:430: TEST_scrub_permit_time: expr 3 - 1 2026-03-08T23:48:25.017 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:430: TEST_scrub_permit_time: seq 0 2 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:430: TEST_scrub_permit_time: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:432: TEST_scrub_permit_time: run_osd td/osd-scrub-test 0 --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/0 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 
2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/0' 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/0/journal' 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:48:25.018 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:48:25.019 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: 
ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22' 2026-03-08T23:48:25.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/0 2026-03-08T23:48:25.020 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:48:25.021 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 2fc77811-789b-42c7-9696-58c2634a804d 2026-03-08T23:48:25.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=2fc77811-789b-42c7-9696-58c2634a804d 2026-03-08T23:48:25.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 2fc77811-789b-42c7-9696-58c2634a804d' 2026-03-08T23:48:25.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:48:25.033 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDJCq5phhGfAhAATMOiLf3fvbgRMehkG1UoWA== 2026-03-08T23:48:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDJCq5phhGfAhAATMOiLf3fvbgRMehkG1UoWA=="}' 2026-03-08T23:48:25.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 2fc77811-789b-42c7-9696-58c2634a804d -i td/osd-scrub-test/0/new.json 2026-03-08T23:48:25.151 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:48:25.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/0/new.json 2026-03-08T23:48:25.163 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 --mkfs --key AQDJCq5phhGfAhAATMOiLf3fvbgRMehkG1UoWA== --osd-uuid 2fc77811-789b-42c7-9696-58c2634a804d 2026-03-08T23:48:25.180 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:25.184+0000 7fd39cc788c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:25.187 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:25.192+0000 7fd39cc788c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:25.194 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:25.192+0000 7fd39cc788c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:25.194 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:25.192+0000 7fd39cc788c0 -1 bdev(0x55749b2fcc00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:48:25.194 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:25.192+0000 7fd39cc788c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid 2026-03-08T23:48:27.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/0/keyring 2026-03-08T23:48:27.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:48:27.448 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:48:27.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:48:27.448 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:48:27.550 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:48:27.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 
2026-03-08T23:48:27.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:48:27.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:48:27.557 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 2026-03-08T23:48:27.558 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:48:27.604 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:27.608+0000 7f82eb46e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:27.610 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:27.616+0000 7f82eb46e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:27.620 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:27.624+0000 7f82eb46e8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:48:27.716 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:48:27.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:48:27.716 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:48:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:48:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:48:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:48:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:48:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:48:27.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:28.569 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:28.576+0000 7f82eb46e8c0 -1 Falling back to public interface 2026-03-08T23:48:28.881 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:48:28.882 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:28.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:28.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:48:28.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:28.882 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:48:29.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:29.541 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:29.548+0000 7f82eb46e8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:48:30.056 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:48:30.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:30.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:30.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:48:30.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:30.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:48:30.292 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:31.293 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:48:31.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:31.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:31.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:48:31.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:31.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:48:31.477 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2088099867,v1:127.0.0.1:6803/2088099867] [v2:127.0.0.1:6804/2088099867,v1:127.0.0.1:6805/2088099867] exists,up 2fc77811-789b-42c7-9696-58c2634a804d 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:430: TEST_scrub_permit_time: for osd in $(seq 0 $(expr 
$OSDS - 1)) 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:432: TEST_scrub_permit_time: run_osd td/osd-scrub-test 1 --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/1 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' 
--osd-scrub-load-threshold=2000' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/1' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/1/journal' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:31.478 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:31.479 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:48:31.479 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22' 2026-03-08T23:48:31.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/1 2026-03-08T23:48:31.480 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:48:31.481 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 8d18249e-c53a-4f79-af5f-40562c9cbba0 2026-03-08T23:48:31.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=8d18249e-c53a-4f79-af5f-40562c9cbba0 2026-03-08T23:48:31.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 8d18249e-c53a-4f79-af5f-40562c9cbba0' 2026-03-08T23:48:31.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:48:31.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDPCq5pOz78HRAA00NaclW/fK7gFtGrzmHYFA== 2026-03-08T23:48:31.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: 
run_osd: echo '{"cephx_secret": "AQDPCq5pOz78HRAA00NaclW/fK7gFtGrzmHYFA=="}' 2026-03-08T23:48:31.492 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 8d18249e-c53a-4f79-af5f-40562c9cbba0 -i td/osd-scrub-test/1/new.json 2026-03-08T23:48:31.661 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:48:31.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/1/new.json 2026-03-08T23:48:31.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 --mkfs --key AQDPCq5pOz78HRAA00NaclW/fK7gFtGrzmHYFA== --osd-uuid 8d18249e-c53a-4f79-af5f-40562c9cbba0 2026-03-08T23:48:31.694 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:31.700+0000 7fa3198638c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:31.696 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:31.700+0000 7fa3198638c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:48:31.696 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:31.704+0000 7fa3198638c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:31.697 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:31.704+0000 7fa3198638c0 -1 bdev(0x563b0be53c00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:48:31.697 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:31.704+0000 7fa3198638c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid 2026-03-08T23:48:34.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/1/keyring 2026-03-08T23:48:34.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:48:34.431 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:48:34.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:48:34.431 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:48:34.634 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:48:34.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:48:34.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 
--osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 2026-03-08T23:48:34.634 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:48:34.635 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:48:34.640 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:48:34.653 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:34.661+0000 7fb0c8f8f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:34.669 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:34.677+0000 7fb0c8f8f8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:34.676 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:34.677+0000 7fb0c8f8f8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:34.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:48:34.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:35.865 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:35.873+0000 7fb0c8f8f8c0 -1 Falling back to public interface 2026-03-08T23:48:35.977 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:48:35.977 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:35.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:35.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:48:35.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:35.977 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:48:36.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:36.842 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:36.849+0000 7fb0c8f8f8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:48:37.144 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:48:37.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:37.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:37.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:48:37.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:37.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:48:37.329 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:37.873 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:37.881+0000 7fb0c4748640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T23:48:38.330 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:48:38.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:38.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:38.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:48:38.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:38.330 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/3212087909,v1:127.0.0.1:6811/3212087909] [v2:127.0.0.1:6812/3212087909,v1:127.0.0.1:6813/3212087909] exists,up 8d18249e-c53a-4f79-af5f-40562c9cbba0 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:48:38.527 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:430: TEST_scrub_permit_time: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:432: TEST_scrub_permit_time: run_osd td/osd-scrub-test 2 --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/2 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 
2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/2' 2026-03-08T23:48:38.527 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/2/journal' 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:48:38.528 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:48:38.528 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:48:38.529 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22' 2026-03-08T23:48:38.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/2 2026-03-08T23:48:38.530 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:48:38.531 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 a653829d-7bde-4d7f-86bf-39b50af870f2 2026-03-08T23:48:38.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a653829d-7bde-4d7f-86bf-39b50af870f2 2026-03-08T23:48:38.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 a653829d-7bde-4d7f-86bf-39b50af870f2' 2026-03-08T23:48:38.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:48:38.546 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQDWCq5pQd0iIRAAxPawLRKjlFJanswWxDUYfA== 2026-03-08T23:48:38.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQDWCq5pQd0iIRAAxPawLRKjlFJanswWxDUYfA=="}' 2026-03-08T23:48:38.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a653829d-7bde-4d7f-86bf-39b50af870f2 -i td/osd-scrub-test/2/new.json 2026-03-08T23:48:38.729 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:48:38.744 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/2/new.json 2026-03-08T23:48:38.745 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 --mkfs --key AQDWCq5pQd0iIRAAxPawLRKjlFJanswWxDUYfA== --osd-uuid a653829d-7bde-4d7f-86bf-39b50af870f2 2026-03-08T23:48:38.761 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:38.769+0000 7fdef6a5e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:38.763 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:38.769+0000 7fdef6a5e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:38.763 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:38.769+0000 7fdef6a5e8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:38.764 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:38.769+0000 7fdef6a5e8c0 -1 bdev(0x55819a1f7c00 td/osd-scrub-test/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:48:38.764 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:38.769+0000 7fdef6a5e8c0 -1 bluestore(td/osd-scrub-test/2) _read_fsid unparsable uuid 2026-03-08T23:48:41.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/2/keyring 2026-03-08T23:48:41.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:48:41.021 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:48:41.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:48:41.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:48:41.230 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:48:41.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 
2026-03-08T23:48:41.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --bluestore_cache_autotune=false --osd_deep_scrub_randomize_ratio=0.0 --osd_scrub_interval_randomize_ratio=0 --osd_scrub_begin_hour=21 --osd_scrub_end_hour=22 2026-03-08T23:48:41.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:48:41.231 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:48:41.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:48:41.248 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:41.253+0000 7f6be7e188c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:41.248 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:41.253+0000 7f6be7e188c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:48:41.250 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:41.257+0000 7f6be7e188c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:48:41.422 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:48:41.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:48:41.422 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:48:41.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:48:41.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:48:41.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:48:41.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:41.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:48:41.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:41.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:48:41.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:42.595 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:48:42.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:42.595 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:42.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:48:42.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:42.595 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:48:42.685 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:42.693+0000 7f6be7e188c0 -1 Falling back to public interface 2026-03-08T23:48:42.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:43.665 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:43.673+0000 7f6be7e188c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:48:43.775 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:48:43.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:43.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:43.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:48:43.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:43.775 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:48:44.025 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:45.026 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:48:45.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:45.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:45.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:48:45.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:45.027 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:48:45.300 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:46.301 INFO:tasks.workunit.client.0.vm03.stdout:4 2026-03-08T23:48:46.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:46.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:46.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 4 2026-03-08T23:48:46.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:46.301 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:48:46.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:48:47.250 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:48:47.257+0000 7f6be35d1640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T23:48:47.758 INFO:tasks.workunit.client.0.vm03.stdout:5 2026-03-08T23:48:47.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:48:47.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:48:47.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 5 2026-03-08T23:48:47.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:48:47.758 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:48:47.926 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3203628885,v1:127.0.0.1:6819/3203628885] [v2:127.0.0.1:6820/3203628885,v1:127.0.0.1:6821/3203628885] exists,up a653829d-7bde-4d7f-86bf-39b50af870f2 2026-03-08T23:48:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:48:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:48:47.927 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:48:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:440: TEST_scrub_permit_time: create_pool test 1 1 2026-03-08T23:48:47.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T23:48:48.171 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:48:48.185 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:48:49.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:441: TEST_scrub_permit_time: wait_for_clean 2026-03-08T23:48:49.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:48:49.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:48:49.186 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:48:49.186 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:48:49.186 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:48:49.187 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 
2026-03-08T23:48:49.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:48:49.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:48:49.187 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:48:49.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:48:49.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:48:49.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:48:49.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:48:49.244 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:48:49.244 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:48:49.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:48:49.411 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:48:49.411 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:48:49.411 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:48:49.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:48:49.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:48:49.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836500 2026-03-08T23:48:49.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836500 2026-03-08T23:48:49.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500' 2026-03-08T23:48:49.491 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:48:49.491 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:48:49.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672973 2026-03-08T23:48:49.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672973 2026-03-08T23:48:49.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672973' 2026-03-08T23:48:49.578 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:48:49.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:48:49.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509443 2026-03-08T23:48:49.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509443 2026-03-08T23:48:49.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836500 1-42949672973 2-64424509443' 2026-03-08T23:48:49.664 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:48:49.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836500 2026-03-08T23:48:49.665 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:48:49.666 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:48:49.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836500 2026-03-08T23:48:49.666 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:48:49.667 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836500 
2026-03-08T23:48:49.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836500 2026-03-08T23:48:49.667 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836500' 2026-03-08T23:48:49.667 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:48:49.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836500 -lt 21474836500 2026-03-08T23:48:49.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:48:49.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672973 2026-03-08T23:48:49.838 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:48:49.838 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:48:49.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672973 2026-03-08T23:48:49.839 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:48:49.840 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672973 2026-03-08T23:48:49.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=42949672973 2026-03-08T23:48:49.840 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672973' 2026-03-08T23:48:49.840 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:48:50.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672973 -lt 42949672973 2026-03-08T23:48:50.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:48:50.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509443 2026-03-08T23:48:50.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:48:50.007 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:48:50.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509443 2026-03-08T23:48:50.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:48:50.008 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509443 2026-03-08T23:48:50.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509443 2026-03-08T23:48:50.009 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.2 seq 64424509443' 2026-03-08T23:48:50.009 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:48:50.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509443 2026-03-08T23:48:50.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:48:51.173 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:48:51.173 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:48:51.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509443 2026-03-08T23:48:51.350 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:48:52.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:48:52.351 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:48:52.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509445 -lt 64424509443 2026-03-08T23:48:52.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 
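The trace above shows `flush_pg_stats` walking a list of `osd-seq` pairs and polling `ceph osd last-stat-seq` until each OSD's reported sequence catches up (note osd.2 briefly lagging at 64424509442 before reaching 64424509443). A minimal sketch of that wait loop, with a hypothetical `get_last_stat_seq` stub standing in for the real `ceph osd last-stat-seq $osd` call so it runs without a cluster:

```shell
# Sketch of the flush_pg_stats wait loop traced above. Each entry in $seqs
# is an "osd-seq" pair; the loop polls until the OSD's stat seq catches up.
# get_last_stat_seq is an illustrative stub, not a real ceph command.
get_last_stat_seq() {
    echo 64424509443
}

flush_wait() {
    local seqs=$1 timeout=${2:-500}
    local s osd seq
    for s in $seqs; do
        osd=$(echo "$s" | cut -d - -f 1)     # part before the dash: osd id
        seq=$(echo "$s" | cut -d - -f 2)     # part after the dash: target seq
        echo "waiting osd.$osd seq $seq"
        while test "$(get_last_stat_seq "$osd")" -lt "$seq"; do
            sleep 1
            timeout=$((timeout - 1))
            test "$timeout" -eq 0 && return 1
        done
    done
    return 0
}

flush_wait "2-64424509443"
```

As in the trace, a lagging OSD only costs one-second sleeps until its seq advances; the countdown (500, 499, ...) bounds the total wait.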
2026-03-08T23:48:52.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:48:52.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:48:52.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:48:52.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:48:52.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:48:52.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:48:52.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:48:52.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:48:52.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:48:52.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:48:52.885 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:48:52.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:48:52.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:48:52.885 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:48:53.091 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:48:53.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:48:53.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:48:53.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:444: TEST_scrub_permit_time: get_pg test SOMETHING 2026-03-08T23:48:53.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1093: get_pg: local poolname=test 2026-03-08T23:48:53.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1094: get_pg: local objectname=SOMETHING 2026-03-08T23:48:53.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: ceph --format json osd map test SOMETHING 2026-03-08T23:48:53.092 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1096: get_pg: jq -r .pgid 2026-03-08T23:48:53.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:444: TEST_scrub_permit_time: local pgid=1.0 2026-03-08T23:48:53.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:445: TEST_scrub_permit_time: get_primary test SOMETHING 2026-03-08T23:48:53.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:48:53.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=SOMETHING 2026-03-08T23:48:53.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test SOMETHING 2026-03-08T23:48:53.256 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:48:53.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:445: TEST_scrub_permit_time: local primary=1 2026-03-08T23:48:53.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:446: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:48:53.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:53.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 
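The helpers traced here (`get_num_active_clean`, `get_last_scrub_stamp`) both pipe `ceph --format json pg dump pgs` into jq; the former counts states that contain both "active" and "clean" but not "stale". A pure-shell analogue of that filter (an illustration to avoid the jq dependency, with made-up sample states rather than ones from this run):

```shell
# Pure-shell analogue of the get_num_active_clean jq filter traced above:
# count PG states containing both "active" and "clean" but not "stale".
count_active_clean() {
    local count=0 st
    for st in "$@"; do
        case $st in
            *stale*) ;;                                   # exclude stale PGs
            *active*clean*|*clean*active*) count=$((count + 1)) ;;
        esac
    done
    echo "$count"
}

count_active_clean active+clean active+clean+scrubbing stale+active+clean peering
```

With those four sample states this prints 2: the stale PG is excluded first, mirroring the jq `select(contains("stale") | not)` stage, and `peering` matches neither substring.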
2026-03-08T23:48:53.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:53.513 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:53.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:446: TEST_scrub_permit_time: local last_scrub=2026-03-08T23:48:48.179184+0000 2026-03-08T23:48:53.673 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:451: TEST_scrub_permit_time: ceph tell 1.0 schedule-scrub 86400 2026-03-08T23:48:53.743 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:48:53.743 INFO:tasks.workunit.client.0.vm03.stdout: "deep": false, 2026-03-08T23:48:53.743 INFO:tasks.workunit.client.0.vm03.stdout: "must": false, 2026-03-08T23:48:53.743 INFO:tasks.workunit.client.0.vm03.stdout: "stamp": "2026-03-07T23:47:13.755266+0000" 2026-03-08T23:48:53.743 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:48:53.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i=0 )) 2026-03-08T23:48:53.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:48:53.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:48:53.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 
2026-03-08T23:48:53.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:53.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:53.754 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:53.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-08T23:48:48.179184+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:48:53.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:48:54.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:48:54.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:48:54.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:48:54.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:54.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:54.915 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:54.915 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:55.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-08T23:48:48.179184+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:48:55.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:48:56.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:48:56.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:48:56.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:48:56.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:56.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:56.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:56.073 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:56.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:48:56.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:48:57.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:48:57.235 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:48:57.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:48:57.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:57.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:57.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:57.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:57.391 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:48:57.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:48:58.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:48:58.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:48:58.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:48:58.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:58.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:58.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:58.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:58.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 
2026-03-08T23:48:58.559 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:48:59.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:48:59.560 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:48:59.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:48:59.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:48:59.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:48:59.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:48:59.561 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:48:59.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:48:59.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:49:00.736 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:49:00.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:49:00.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:49:00.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:49:00.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:49:00.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:49:00.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:49:00.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:49:00.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:49:01.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:49:01.904 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:49:01.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:49:01.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:49:01.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:49:01.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:49:01.905 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:49:02.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:49:02.067 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:49:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:49:03.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:49:03.068 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:49:03.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:49:03.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:49:03.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:49:03.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:49:03.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:49:03.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:49:04.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:49:04.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:49:04.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:49:04.229 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:49:04.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:49:04.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:49:04.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:49:04.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:49:04.397 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:49:05.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:49:05.398 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:49:05.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:49:05.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:49:05.399 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:49:05.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:49:05.399 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:49:05.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000 2026-03-08T23:49:05.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1 2026-03-08T23:49:06.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ )) 2026-03-08T23:49:06.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 )) 2026-03-08T23:49:06.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0 2026-03-08T23:49:06.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:49:06.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:49:06.563 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:06.563 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:06.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:06.724 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:07.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:07.725 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:07.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:07.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:07.725 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:07.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:07.726 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:07.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:07.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:08.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:08.886 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:08.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:08.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:08.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:08.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:08.886 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:09.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:09.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:10.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:10.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:10.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:10.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:10.048 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:10.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:10.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:10.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:10.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:11.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:11.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:11.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:11.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:11.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:11.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:11.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:11.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:11.390 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:12.391 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:12.392 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:12.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:12.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:12.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:12.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:12.392 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:12.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:12.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:13.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:13.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:13.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:13.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:13.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:13.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:13.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:13.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:13.735 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:14.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:14.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:14.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:14.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:14.736 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:14.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:14.737 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:14.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:15.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:15.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:15.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:15.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:15.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:15.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:15.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:16.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:16.072 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:17.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:17.073 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:17.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:17.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:17.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:17.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:17.073 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:17.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:17.241 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:18.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:18.242 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:18.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:18.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:18.242 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:18.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:18.243 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:18.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:18.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:19.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:19.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:19.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:19.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:19.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:19.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:19.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:19.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:19.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:20.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:20.578 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:20.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:20.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:20.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:20.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:20.578 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:20.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:20.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:21.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:21.751 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:21.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:21.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:21.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:21.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:21.752 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:21.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:21.925 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:22.927 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:22.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:22.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:22.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:22.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:22.927 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:23.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:23.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:24.093 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:24.094 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:24.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:24.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:24.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:24.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:24.094 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:24.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:24.261 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:25.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:25.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:25.263 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:25.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:25.436 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:26.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:26.437 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:26.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:26.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:26.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:26.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:26.437 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:26.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:26.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:27.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:27.601 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:27.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: get_last_scrub_stamp 1.0
2026-03-08T23:49:27.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0
2026-03-08T23:49:27.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp
2026-03-08T23:49:27.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs
2026-03-08T23:49:27.601 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp'
2026-03-08T23:49:27.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:455: TEST_scrub_permit_time: test 2026-03-07T23:47:13.755266+0000 '>' 2026-03-08T23:48:48.179184+0000
2026-03-08T23:49:27.773 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:458: TEST_scrub_permit_time: sleep 1
2026-03-08T23:49:28.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i++ ))
2026-03-08T23:49:28.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:454: TEST_scrub_permit_time: (( i < 30 ))
2026-03-08T23:49:28.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:460: TEST_scrub_permit_time: perf_counters td/osd-scrub-test 3
2026-03-08T23:49:28.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:43: perf_counters: local dir=td/osd-scrub-test
2026-03-08T23:49:28.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:44: perf_counters: local OSDS=3
2026-03-08T23:49:28.775 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: expr 3 - 1
2026-03-08T23:49:28.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: seq 0 2
2026-03-08T23:49:28.776 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:49:28.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.0 counter dump
2026-03-08T23:49:28.777 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))'
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "labels": {
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep",
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec"
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "counters": {
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": {
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": {
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": {
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": {
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [
2026-03-08T23:49:28.851 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "labels": {
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep",
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated"
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "counters": {
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": {
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": {
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": {
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": {
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:49:28.852 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.852 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: 
"replicas_in_reservation": 0 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.853 
INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:49:28.853 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:49:28.863 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stdout:} 
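The dump above is produced by the traced pipeline `ceph tell osd.0 counter dump | jq 'with_entries(select(.key | startswith("osd_scrub")))'`, which keeps only the top-level sections whose key begins with `osd_scrub`. For readers without a cluster at hand, the same key filtering can be sketched offline in Python; the sample `dump` dict below is a hypothetical, heavily truncated stand-in for a real counter dump, not output from this run:

```python
import json

# Hypothetical, truncated stand-in for `ceph tell osd.N counter dump` output:
# a real dump has many more sections and counters than shown here.
dump = {
    "osd_scrub_dp_ec": [
        {"labels": {"level": "deep", "pooltype": "ec"},
         "counters": {"num_scrubs_started": 0, "successful_scrubs": 0}}
    ],
    "osd_op_latency": [  # non-scrub section; the filter should drop it
        {"labels": {}, "counters": {}}
    ],
}

# Equivalent of: jq 'with_entries(select(.key | startswith("osd_scrub")))'
scrub_only = {k: v for k, v in dump.items() if k.startswith("osd_scrub")}

print(json.dumps(sorted(scrub_only)))  # -> ["osd_scrub_dp_ec"]
```

The non-scrub section is filtered out while the labeled scrub counters pass through unchanged, mirroring what `jq`'s `with_entries(select(...))` does to each top-level entry of the dump.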
2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.1 counter dump 2026-03-08T23:49:28.864 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:49:28.932 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 
2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 
2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.933 
INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:49:28.933 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.934 
INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 
0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.934 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [ 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:49:28.935 
INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 
2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:49:28.935 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:49:28.943 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:49:28.944 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.2 counter dump 2026-03-08T23:49:28.944 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:29.016 
INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:49:29.016 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: 
"osd_scrub_dp_repl": [
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "labels": {
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep",
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated"
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "counters": {
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": {
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": {
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": {
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.017 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": {
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "labels": {
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow",
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec"
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "counters": {
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": {
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": {
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0,
2026-03-08T23:49:29.018 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": {
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": {
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: ],
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_repl": [
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: {
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "labels": {
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow",
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated"
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "counters": {
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": {
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": {
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0,
2026-03-08T23:49:29.019 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0,
2026-03-08T23:49:29.020 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0,
2026-03-08T23:49:29.020 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0,
2026-03-08T23:49:29.020 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": {
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0,
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0,
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0,
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": {
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0,
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0,
2026-03-08T23:49:29.025 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stdout: },
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stdout: }
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stdout: ]
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_scrub_permit_time ------------------
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_scrub_permit_time ------------------'
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:49:29.026 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:49:29.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:49:29.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:49:29.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:49:29.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:49:29.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:49:29.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:49:29.144 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:49:29.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:49:29.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:49:29.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:49:29.145 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:49:29.146 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:49:29.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:49:29.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test
2026-03-08T23:49:29.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:49:29.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:49:29.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:49:29.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_scrub_permit_time ------------------
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Prepare Test TEST_scrub_test -------------------
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_scrub_permit_time ------------------'
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:31: run: for func in $funcs
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:32: run: echo '-------------- Prepare Test TEST_scrub_test -------------------'
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:33: run: setup td/osd-scrub-test
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-scrub-test
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-scrub-test
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:49:29.160 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:49:29.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:49:29.162 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:49:29.162 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:49:29.163 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:49:29.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:49:29.164 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:49:29.164 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:49:29.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:49:29.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:49:29.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:49:29.165 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:49:29.166 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:49:29.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:49:29.167 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test
2026-03-08T23:49:29.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:49:29.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:49:29.168 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:49:29.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827
2026-03-08T23:49:29.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:49:29.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:49:29.169 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-scrub-test
2026-03-08T23:49:29.170 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir
2026-03-08T23:49:29.170 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:49:29.170 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:49:29.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.475827
2026-03-08T23:49:29.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n
2026-03-08T23:49:29.171 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Run Test TEST_scrub_test -----------------------
2026-03-08T23:49:29.171 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 4096 -le 1024 ']'
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']'
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-scrub-test 1' TERM HUP INT
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:34: run: echo '-------------- Run Test TEST_scrub_test -----------------------'
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:35: run: TEST_scrub_test td/osd-scrub-test
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:52: TEST_scrub_test: local dir=td/osd-scrub-test
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:53: TEST_scrub_test: local poolname=test
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:54: TEST_scrub_test: local OSDS=3
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:55: TEST_scrub_test: local objects=15
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:57: TEST_scrub_test: TESTDATA=testdata.475827
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:59: TEST_scrub_test: run_mon td/osd-scrub-test a --osd_pool_default_size=3
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-scrub-test
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-scrub-test/a
2026-03-08T23:49:29.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-scrub-test/a --run-dir=td/osd-scrub-test --osd_pool_default_size=3
2026-03-08T23:49:29.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path
2026-03-08T23:49:29.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:49:29.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:49:29.196 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:49:29.197 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:49:29.197 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:49:29.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:49:29.197 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-scrub-test/a '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --mon-cluster-log-file=td/osd-scrub-test/log --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false --osd_pool_default_size=3
2026-03-08T23:49:29.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat
2026-03-08T23:49:29.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid
2026-03-08T23:49:29.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:49:29.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:49:29.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid
2026-03-08T23:49:29.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid
2026-03-08T23:49:29.226 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:49:29.226 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:49:29.226 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:49:29.227 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:49:29.227 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:49:29.227 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:49:29.227 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok
2026-03-08T23:49:29.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:49:29.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get fsid
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']'
2026-03-08T23:49:29.294 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir
2026-03-08T23:49:29.295 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:49:29.295 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:49:29.295 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.475827/ceph-mon.a.asok
2026-03-08T23:49:29.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS=
2026-03-08T23:49:29.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.475827/ceph-mon.a.asok config get mon_host
2026-03-08T23:49:29.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:60: TEST_scrub_test: run_mgr td/osd-scrub-test x --mgr_stats_period=1
2026-03-08T23:49:29.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-scrub-test
2026-03-08T23:49:29.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift
2026-03-08T23:49:29.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x
2026-03-08T23:49:29.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift
2026-03-08T23:49:29.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-scrub-test/x
2026-03-08T23:49:29.356 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force
2026-03-08T23:49:29.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path
2026-03-08T23:49:29.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:49:29.461 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:49:29.461 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:49:29.461 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:49:29.461 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:49:29.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok'
2026-03-08T23:49:29.462 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr
2026-03-08T23:49:29.463 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-scrub-test/x '--log-file=td/osd-scrub-test/$name.log' '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --run-dir=td/osd-scrub-test '--pid-file=td/osd-scrub-test/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr --mgr_stats_period=1
2026-03-08T23:49:29.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:61: TEST_scrub_test: local 'ceph_osd_args=--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 '
2026-03-08T23:49:29.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:62: TEST_scrub_test: ceph_osd_args+='--osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 '
2026-03-08T23:49:29.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:63: TEST_scrub_test: ceph_osd_args+=--osd_stats_update_period_scrubbing=2
2026-03-08T23:49:29.483 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:64: TEST_scrub_test: expr 3 - 1
2026-03-08T23:49:29.489 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:64: TEST_scrub_test: seq 0 2
2026-03-08T23:49:29.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:64: TEST_scrub_test: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:49:29.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:66: TEST_scrub_test: run_osd td/osd-scrub-test 0 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2
2026-03-08T23:49:29.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test
2026-03-08T23:49:29.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift
2026-03-08T23:49:29.489
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:49:29.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:49:29.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/0 2026-03-08T23:49:29.489 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/0' 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/0/journal' 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: 
ceph_args+= 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:49:29.490 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:49:29.492 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:49:29.492 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:49:29.492 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:49:29.492 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:49:29.493 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2' 2026-03-08T23:49:29.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: 
run_osd: mkdir -p td/osd-scrub-test/0 2026-03-08T23:49:29.494 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:49:29.495 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 fbb972ff-97cf-4d44-90d5-6a1e226517aa 2026-03-08T23:49:29.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=fbb972ff-97cf-4d44-90d5-6a1e226517aa 2026-03-08T23:49:29.495 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 fbb972ff-97cf-4d44-90d5-6a1e226517aa' 2026-03-08T23:49:29.495 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:49:29.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAJC65pcunRHhAAPnUqafMf2ad2kzSc4s2qlw== 2026-03-08T23:49:29.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAJC65pcunRHhAAPnUqafMf2ad2kzSc4s2qlw=="}' 2026-03-08T23:49:29.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new fbb972ff-97cf-4d44-90d5-6a1e226517aa -i td/osd-scrub-test/0/new.json 2026-03-08T23:49:29.607 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:49:29.622 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/0/new.json 2026-03-08T23:49:29.623 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none 
--mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --mkfs --key AQAJC65pcunRHhAAPnUqafMf2ad2kzSc4s2qlw== --osd-uuid fbb972ff-97cf-4d44-90d5-6a1e226517aa 2026-03-08T23:49:29.639 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:29.645+0000 7f8dc37938c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:29.641 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:29.649+0000 7f8dc37938c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:29.643 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:29.649+0000 7f8dc37938c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:49:29.643 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:29.649+0000 7f8dc37938c0 -1 bdev(0x556be19ccc00 td/osd-scrub-test/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:49:29.643 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:29.649+0000 7f8dc37938c0 -1 bluestore(td/osd-scrub-test/0) _read_fsid unparsable uuid 2026-03-08T23:49:31.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/0/keyring 2026-03-08T23:49:31.893 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:49:31.894 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:49:31.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:49:31.894 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:49:32.012 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:49:32.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:49:32.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/0 --osd-journal=td/osd-scrub-test/0/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 
--debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 2026-03-08T23:49:32.012 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:49:32.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:49:32.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:49:32.028 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:32.037+0000 7f3693cf88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:32.034 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:32.041+0000 7f3693cf88c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:32.036 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:32.041+0000 7f3693cf88c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:32.190 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:49:32.351 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:32.741 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:32.749+0000 7f3693cf88c0 -1 Falling back to public interface 2026-03-08T23:49:33.352 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:49:33.352 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:33.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:33.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:49:33.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:33.352 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:49:33.512 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:33.717 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:33.725+0000 7f3693cf88c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:49:34.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:34.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:34.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:49:34.513 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:49:34.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:49:34.513 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:34.683 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:34.689+0000 7f368f4b1640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:49:34.734 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:35.735 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:49:35.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:35.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:35.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:49:35.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:35.736 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:49:35.907 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/2325065186,v1:127.0.0.1:6803/2325065186] [v2:127.0.0.1:6804/2325065186,v1:127.0.0.1:6805/2325065186] exists,up fbb972ff-97cf-4d44-90d5-6a1e226517aa 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:49:35.908 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:64: TEST_scrub_test: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:66: TEST_scrub_test: run_osd td/osd-scrub-test 1 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/1 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 
2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/1' 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/1/journal' 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:49:35.908 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:49:35.908 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:49:35.909 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2' 2026-03-08T23:49:35.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/1 2026-03-08T23:49:35.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:49:35.911 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 a6a5788a-349b-48e6-a35a-ec9cade43440 2026-03-08T23:49:35.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=a6a5788a-349b-48e6-a35a-ec9cade43440 2026-03-08T23:49:35.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 a6a5788a-349b-48e6-a35a-ec9cade43440' 2026-03-08T23:49:35.911 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:49:35.922 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAPC65ptMWiNxAAkqqIH1tQB8b4G66whWQEgQ== 2026-03-08T23:49:35.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAPC65ptMWiNxAAkqqIH1tQB8b4G66whWQEgQ=="}' 2026-03-08T23:49:35.922 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new a6a5788a-349b-48e6-a35a-ec9cade43440 -i td/osd-scrub-test/1/new.json 2026-03-08T23:49:36.085 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:49:36.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/1/new.json 2026-03-08T23:49:36.102 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --mkfs --key AQAPC65ptMWiNxAAkqqIH1tQB8b4G66whWQEgQ== --osd-uuid a6a5788a-349b-48e6-a35a-ec9cade43440 2026-03-08T23:49:36.118 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:36.125+0000 7fc48320c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:36.119 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:36.125+0000 7fc48320c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:36.120 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:36.129+0000 7fc48320c8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:36.120 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:36.129+0000 7fc48320c8c0 -1 bdev(0x5572c8aa9c00 td/osd-scrub-test/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:49:36.121 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:36.129+0000 7fc48320c8c0 -1 bluestore(td/osd-scrub-test/1) _read_fsid unparsable uuid 2026-03-08T23:49:38.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/1/keyring 2026-03-08T23:49:38.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:49:38.377 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:49:38.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:49:38.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:49:38.573 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:49:38.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 
2026-03-08T23:49:38.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/1 --osd-journal=td/osd-scrub-test/1/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 2026-03-08T23:49:38.573 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:49:38.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:49:38.576 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:49:38.590 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:38.597+0000 7f4deacb48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:38.590 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:38.597+0000 7f4deacb48c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:38.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:38.597+0000 7f4deacb48c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:38.750 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:49:38.915 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:39.549 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:39.557+0000 7f4deacb48c0 -1 Falling back to public interface 2026-03-08T23:49:39.916 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:49:39.916 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:39.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:39.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:49:39.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:39.916 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:49:40.081 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:40.518 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:40.525+0000 7f4deacb48c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:49:41.083 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:49:41.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:41.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:41.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:49:41.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:41.083 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:49:41.261 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:41.472 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:41.481+0000 7f4de646d640 -1 osd.1 0 waiting for initial osdmap 2026-03-08T23:49:42.262 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:49:42.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:42.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:42.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:49:42.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:42.262 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:49:42.427 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2405761458,v1:127.0.0.1:6811/2405761458] [v2:127.0.0.1:6812/2405761458,v1:127.0.0.1:6813/2405761458] exists,up a6a5788a-349b-48e6-a35a-ec9cade43440 2026-03-08T23:49:42.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:49:42.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:49:42.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:49:42.427 
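The `wait_for_osd` trace above is a bounded polling loop: grep `ceph osd dump` for `osd.N up` as many as 500 times, sleeping 1s between attempts, and return 0 as soon as the match appears. A minimal generic sketch of that pattern (the `wait_for` name and signature are illustrative, not from ceph-helpers.sh):

```shell
# Generic bounded-retry sketch of the wait_for_osd pattern seen above:
# run a predicate command up to $tries times, sleeping 1s between
# attempts; return non-zero if it never succeeds.
wait_for() {
    local tries=$1; shift
    local i
    for ((i = 0; i < tries; i++)); do
        "$@" && return 0   # predicate succeeded
        sleep 1            # back off before the next poll
    done
    return 1               # timed out
}
```

In the log the predicate is `ceph osd dump | grep 'osd.1 up'`; here any command with a meaningful exit status works.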
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:64: TEST_scrub_test: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:49:42.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:66: TEST_scrub_test: run_osd td/osd-scrub-test 2 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 2026-03-08T23:49:42.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-scrub-test 2026-03-08T23:49:42.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:49:42.427 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-scrub-test/2 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 
2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-scrub-test/2' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/2/journal' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:49:42.428 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:49:42.428 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:49:42.429 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2' 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-scrub-test/2 2026-03-08T23:49:42.429 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:49:42.430 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 18699f64-12a1-4999-bce2-0a12a3c0e760 2026-03-08T23:49:42.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=18699f64-12a1-4999-bce2-0a12a3c0e760 2026-03-08T23:49:42.430 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 18699f64-12a1-4999-bce2-0a12a3c0e760' 2026-03-08T23:49:42.430 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:49:42.441 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQAWC65pHgr6GhAAjWMwRO3kIAPSU5KumTKd8w== 2026-03-08T23:49:42.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQAWC65pHgr6GhAAjWMwRO3kIAPSU5KumTKd8w=="}' 2026-03-08T23:49:42.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 18699f64-12a1-4999-bce2-0a12a3c0e760 -i td/osd-scrub-test/2/new.json 2026-03-08T23:49:42.603 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:49:42.619 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-scrub-test/2/new.json 2026-03-08T23:49:42.620 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 --mkfs --key AQAWC65pHgr6GhAAjWMwRO3kIAPSU5KumTKd8w== --osd-uuid 18699f64-12a1-4999-bce2-0a12a3c0e760 2026-03-08T23:49:42.636 
INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:42.645+0000 7fab106298c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:42.638 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:42.645+0000 7fab106298c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:42.639 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:42.645+0000 7fab106298c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:42.639 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:42.645+0000 7fab106298c0 -1 bdev(0x555e4d3dbc00 td/osd-scrub-test/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:49:42.640 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:42.645+0000 7fab106298c0 -1 bluestore(td/osd-scrub-test/2) _read_fsid unparsable uuid 2026-03-08T23:49:44.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-scrub-test/2/keyring 2026-03-08T23:49:44.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:49:44.990 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:49:44.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:49:44.990 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-scrub-test/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:49:45.188 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:49:45.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 
2026-03-08T23:49:45.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test '--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 2026-03-08T23:49:45.188 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:49:45.189 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:49:45.192 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:49:45.206 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:45.213+0000 7f46831738c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:45.206 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:45.213+0000 7f46831738c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:45.208 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:45.213+0000 7f46831738c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:45.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:49:45.529 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:46.531 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:49:46.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:46.531 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:46.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:49:46.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:49:46.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:46.645 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:46.653+0000 7f46831738c0 -1 Falling back to public interface 2026-03-08T23:49:46.699 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:47.629 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:47.637+0000 7f46831738c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:49:47.700 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:49:47.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:47.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:47.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:49:47.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:47.700 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:49:47.869 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:48.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:48.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:48.870 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:49:48.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:49:48.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:48.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:49:49.036 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/3244030047,v1:127.0.0.1:6819/3244030047] [v2:127.0.0.1:6820/3244030047,v1:127.0.0.1:6821/3244030047] exists,up 18699f64-12a1-4999-bce2-0a12a3c0e760 2026-03-08T23:49:49.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:49:49.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:49:49.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:49:49.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:70: TEST_scrub_test: create_pool test 1 1 
2026-03-08T23:49:49.036 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:541: create_pool: ceph osd pool create test 1 1 2026-03-08T23:49:49.245 INFO:tasks.workunit.client.0.vm03.stderr:pool 'test' created 2026-03-08T23:49:49.263 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:542: create_pool: sleep 1 2026-03-08T23:49:50.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:71: TEST_scrub_test: wait_for_clean 2026-03-08T23:49:50.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:49:50.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:49:50.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:49:50.264 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:49:50.265 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:49:50.265 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:49:50.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:49:50.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 
2026-03-08T23:49:50.265 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:49:50.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:49:50.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:49:50.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:49:50.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:49:50.333 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:49:50.333 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:49:50.493 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:49:50.517 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:49:50.518 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:49:50.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:49:50.518 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:49:50.518 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:49:50.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836497 2026-03-08T23:49:50.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836497 2026-03-08T23:49:50.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497' 2026-03-08T23:49:50.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:49:50.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:49:50.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672970 2026-03-08T23:49:50.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672970 2026-03-08T23:49:50.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672970' 2026-03-08T23:49:50.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:49:50.669 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:49:50.745 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509443 2026-03-08T23:49:50.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509443 2026-03-08T23:49:50.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836497 1-42949672970 2-64424509443' 2026-03-08T23:49:50.746 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:49:50.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836497 2026-03-08T23:49:50.746 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:49:50.747 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:49:50.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836497 2026-03-08T23:49:50.747 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:49:50.748 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836497 2026-03-08T23:49:50.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836497 2026-03-08T23:49:50.748 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836497' 
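`flush_pg_stats` records one `osd-seq` token per OSD (e.g. `0-21474836497`), splits each token back apart with `cut -d - -f 1` / `-f 2`, and then polls `ceph osd last-stat-seq <osd>` until the reported sequence reaches the recorded target. A hypothetical Python sketch of that bookkeeping (the `last_stat_seq` callable stands in for the ceph command):

```python
def wait_for_flush(seqs, last_stat_seq, sleep=lambda s: None):
    """seqs: tokens like '0-21474836497' (osd id, dash, target seq).
    Poll last_stat_seq(osd) until it reaches the recorded target,
    mirroring the loop in ceph-helpers.sh flush_pg_stats."""
    for token in seqs:
        osd, _, target = token.partition("-")
        target = int(target)
        print(f"waiting osd.{osd} seq {target}")
        while last_stat_seq(int(osd)) < target:
            sleep(1)
```

In the trace the first poll for osd.0 returns 21474836496 (one short of the target), forcing a single 1 s sleep before the next poll succeeds.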
2026-03-08T23:49:50.748 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:49:50.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836497 2026-03-08T23:49:50.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:49:51.914 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:49:51.914 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:49:52.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836499 -lt 21474836497 2026-03-08T23:49:52.089 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:49:52.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672970 2026-03-08T23:49:52.090 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:49:52.090 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:49:52.091 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672970 2026-03-08T23:49:52.091 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:49:52.092 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672970 2026-03-08T23:49:52.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672970 2026-03-08T23:49:52.092 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672970' 2026-03-08T23:49:52.092 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:49:52.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672972 -lt 42949672970 2026-03-08T23:49:52.256 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:49:52.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509443 2026-03-08T23:49:52.257 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:49:52.257 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:49:52.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509443 2026-03-08T23:49:52.258 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:49:52.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509443 2026-03-08T23:49:52.259 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509443 2026-03-08T23:49:52.259 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509443' 2026-03-08T23:49:52.259 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:49:52.426 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509445 -lt 64424509443 2026-03-08T23:49:52.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:49:52.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:49:52.427 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:49:52.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:49:52.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:49:52.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:49:52.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: 
local expression 2026-03-08T23:49:52.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:49:52.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:49:52.632 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:49:52.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:49:52.799 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:49:52.799 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:49:52.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:49:52.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:49:53.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:49:53.005 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:49:53.005 
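The jq expression assembled by `get_num_active_clean` above counts PG states that contain both `active` and `clean` but not `stale`. Equivalent logic sketched in Python (hypothetical helper name; `pg_stats` is the `.pg_stats` array from `ceph --format json pg dump pgs`):

```python
def num_active_clean(pg_stats):
    """Count PGs whose state includes 'active' and 'clean' but not
    'stale', mirroring the jq filter built by get_num_active_clean."""
    return sum(
        1
        for pg in pg_stats
        if "active" in pg["state"]
        and "clean" in pg["state"]
        and "stale" not in pg["state"]
    )
```

`wait_for_clean` then compares this count against the total PG count from `ceph status` and breaks out of its retry loop once the two match, as seen in the `test 1 = 1` / `break` lines above.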
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:49:53.005 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:72: TEST_scrub_test: ceph osd dump 2026-03-08T23:49:53.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:72: TEST_scrub_test: awk '{ print $2 }' 2026-03-08T23:49:53.007 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:72: TEST_scrub_test: grep '^pool.*['\'']test['\'']' 2026-03-08T23:49:53.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:72: TEST_scrub_test: poolid=1 2026-03-08T23:49:53.168 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:74: TEST_scrub_test: dd if=/dev/urandom of=testdata.475827 bs=1032 count=1 2026-03-08T23:49:53.169 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:49:53.169 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records out 2026-03-08T23:49:53.169 INFO:tasks.workunit.client.0.vm03.stderr:1032 bytes (1.0 kB, 1.0 KiB) copied, 6.6524e-05 s, 15.5 MB/s 2026-03-08T23:49:53.169 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: seq 1 15 2026-03-08T23:49:53.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.170 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj1 testdata.475827 2026-03-08T23:49:53.191 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.191 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj2 testdata.475827 2026-03-08T23:49:53.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.213 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj3 testdata.475827 2026-03-08T23:49:53.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj4 testdata.475827 2026-03-08T23:49:53.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj5 testdata.475827 2026-03-08T23:49:53.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.275 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj6 testdata.475827 2026-03-08T23:49:53.295 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj7 testdata.475827 2026-03-08T23:49:53.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj8 testdata.475827 2026-03-08T23:49:53.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj9 testdata.475827 2026-03-08T23:49:53.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.359 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj10 testdata.475827 2026-03-08T23:49:53.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj11 testdata.475827 2026-03-08T23:49:53.399 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.399 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj12 testdata.475827 2026-03-08T23:49:53.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.419 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj13 testdata.475827 2026-03-08T23:49:53.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj14 testdata.475827 2026-03-08T23:49:53.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:75: TEST_scrub_test: for i in `seq 1 $objects` 2026-03-08T23:49:53.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:77: TEST_scrub_test: rados -p test put obj15 testdata.475827 2026-03-08T23:49:53.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:79: TEST_scrub_test: rm -f testdata.475827 2026-03-08T23:49:53.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:81: TEST_scrub_test: get_primary test obj1 2026-03-08T23:49:53.482 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:49:53.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:49:53.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:49:53.482 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:49:53.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:81: TEST_scrub_test: local primary=1 2026-03-08T23:49:53.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:82: TEST_scrub_test: get_not_primary test obj1 2026-03-08T23:49:53.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1229: get_not_primary: local poolname=test 2026-03-08T23:49:53.652 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1230: get_not_primary: local objectname=obj1 2026-03-08T23:49:53.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: get_primary test obj1 2026-03-08T23:49:53.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=test 2026-03-08T23:49:53.652 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=obj1 2026-03-08T23:49:53.653 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map test obj1 2026-03-08T23:49:53.653 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:49:53.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1232: get_not_primary: local primary=1 2026-03-08T23:49:53.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1233: get_not_primary: ceph --format json osd map test obj1 2026-03-08T23:49:53.842 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1234: get_not_primary: jq '.acting | map(select (. != 1)) | .[0]' 2026-03-08T23:49:54.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:82: TEST_scrub_test: local otherosd=0 2026-03-08T23:49:54.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:83: TEST_scrub_test: '[' 0 = 2 ']' 2026-03-08T23:49:54.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:87: TEST_scrub_test: local anotherosd=2 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:90: TEST_scrub_test: CORRUPT_DATA=corrupt-data.475827 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:91: TEST_scrub_test: dd if=/dev/urandom of=corrupt-data.475827 bs=512 count=1 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:1+0 records in 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:1+0 
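`get_not_primary` first resolves the acting primary (osd.1 here via `jq .acting_primary`), then picks the first member of the acting set that differs from it with `jq '.acting | map(select (. != 1)) | .[0]'`, yielding osd.0. Sketched in Python (hypothetical function name; the JSON shape is that of `ceph --format json osd map <pool> <obj>` as shown in the trace):

```python
def get_not_primary(osd_map):
    """Return the first acting OSD that is not the acting primary,
    given the osd-map JSON for an object."""
    primary = osd_map["acting_primary"]
    others = [o for o in osd_map["acting"] if o != primary]
    return others[0] if others else None
```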
records out 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:512 bytes copied, 6.6124e-05 s, 7.7 MB/s 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:92: TEST_scrub_test: objectstore_tool td/osd-scrub-test 2 obj1 set-bytes corrupt-data.475827 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-scrub-test 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-scrub-test 2 obj1 set-bytes corrupt-data.475827 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-scrub-test 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T23:49:54.020 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:49:54.020 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-scrub-test TERM osd.2 2026-03-08T23:49:54.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:49:54.021 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:49:54.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:49:54.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:49:54.021 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:49:54.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:49:54.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-scrub-test 2 obj1 set-bytes corrupt-data.475827 2026-03-08T23:49:54.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-scrub-test 2026-03-08T23:49:54.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:49:54.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T23:49:54.126 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:49:54.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-scrub-test/2 2026-03-08T23:49:54.126 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-scrub-test/2 obj1 set-bytes corrupt-data.475827 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-scrub-test 2 --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-scrub-test 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-scrub-test/2 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 
'ceph_args=--fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 ' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-scrub-test/2' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-scrub-test/2/journal' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-scrub-test' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:49:55.320 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:49:55.320 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-scrub-test/$name.log' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-scrub-test/$name.pid' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:49:55.321 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+='--osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2' 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-scrub-test/2 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:49:55.321 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2 2026-03-08T23:49:55.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=e201f9a4-1c53-4cf8-a688-4191c55e504b --auth-supported=none --mon-host=127.0.0.1:7138 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-scrub-test/2 --osd-journal=td/osd-scrub-test/2/journal --chdir= --run-dir=td/osd-scrub-test 
'--admin-socket=/tmp/ceph-asok.475827/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-scrub-test/$name.log' '--pid-file=td/osd-scrub-test/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --osd-scrub-interval-randomize-ratio=0 --osd-deep-scrub-randomize-ratio=0 --osd_scrub_backoff_ratio=0 --osd_stats_update_period_not_scrubbing=3 --osd_stats_update_period_scrubbing=2 2026-03-08T23:49:55.322 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-scrub-test/2/whoami 2026-03-08T23:49:55.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:49:55.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:49:55.324 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:49:55.328 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:49:55.338 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:55.345+0000 7f9c4c2708c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:55.345 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:55.353+0000 7f9c4c2708c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:49:55.346 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:55.353+0000 7f9c4c2708c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:55.500 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:49:55.668 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:56.301 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:56.309+0000 7f9c4c2708c0 -1 Falling back to public interface 2026-03-08T23:49:56.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:49:56.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:56.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:49:56.669 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:49:56.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:49:56.669 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:56.836 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:57.293 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:49:57.301+0000 7f9c4c2708c0 -1 osd.2 19 log_to_monitors true 2026-03-08T23:49:57.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:57.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:57.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:49:57.837 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:49:57.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:57.837 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:49:58.016 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:49:59.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:49:59.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 500 )) 2026-03-08T23:49:59.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:49:59.017 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:49:59.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:49:59.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:49:59.180 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 23 up_thru 0 down_at 20 last_clean_interval [15,19) [v2:127.0.0.1:6818/3186276210,v1:127.0.0.1:6819/3186276210] [v2:127.0.0.1:6820/3186276210,v1:127.0.0.1:6821/3186276210] exists,up 18699f64-12a1-4999-bce2-0a12a3c0e760 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:49:59.181 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:49:59.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:49:59.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
2026-03-08T23:49:59.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:49:59.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:49:59.239 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:49:59.239 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:49:59.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:49:59.406 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:49:59.406 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:49:59.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:49:59.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:49:59.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:49:59.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836507 2026-03-08T23:49:59.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836507 2026-03-08T23:49:59.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 
0-21474836507' 2026-03-08T23:49:59.486 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:49:59.487 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:49:59.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672980 2026-03-08T23:49:59.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672980 2026-03-08T23:49:59.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836507 1-42949672980' 2026-03-08T23:49:59.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:49:59.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:49:59.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247812 2026-03-08T23:49:59.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247812 2026-03-08T23:49:59.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836507 1-42949672980 2-98784247812' 2026-03-08T23:49:59.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:49:59.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836507 2026-03-08T23:49:59.647 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:49:59.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:49:59.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836507 2026-03-08T23:49:59.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:49:59.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836507 2026-03-08T23:49:59.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836507' 2026-03-08T23:49:59.649 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836507 2026-03-08T23:49:59.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:49:59.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836505 -lt 21474836507 2026-03-08T23:49:59.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:00.808 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 
']' 2026-03-08T23:50:00.808 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:00.979 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836508 -lt 21474836507 2026-03-08T23:50:00.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:00.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672980 2026-03-08T23:50:00.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:00.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:50:00.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672980 2026-03-08T23:50:00.981 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:00.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672980 2026-03-08T23:50:00.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672980' 2026-03-08T23:50:00.982 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 42949672980 2026-03-08T23:50:00.983 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: 
ceph osd last-stat-seq 1 2026-03-08T23:50:01.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672981 -lt 42949672980 2026-03-08T23:50:01.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:01.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-98784247812 2026-03-08T23:50:01.158 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:01.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:50:01.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-98784247812 2026-03-08T23:50:01.160 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:01.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247812 2026-03-08T23:50:01.161 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 98784247812' 2026-03-08T23:50:01.161 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 98784247812 2026-03-08T23:50:01.161 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:50:01.336 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247812 -lt 98784247812 2026-03-08T23:50:01.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:50:01.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:01.336 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:01.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:50:01.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:50:01.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:50:01.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:50:01.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:50:01.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:50:01.540 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 
2026-03-08T23:50:01.540 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:50:01.707 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:50:01.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:50:01.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:01.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:01.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:50:01.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:50:01.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:50:01.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:93: TEST_scrub_test: rm -f corrupt-data.475827 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:95: TEST_scrub_test: local pgid=1.0 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:96: TEST_scrub_test: 
pg_deep_scrub 1.0 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1941: pg_deep_scrub: local pgid=1.0 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1943: pg_deep_scrub: wait_for_pg_clean 1.0 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=1.0 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:50:01.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:50:02.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:50:02.048 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a delays 2026-03-08T23:50:02.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:50:02.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:50:02.048 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:50:02.049 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:50:02.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:50:02.221 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:02.221 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:50:02.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:50:02.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:02.221 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:50:02.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836511 2026-03-08T23:50:02.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836511 
2026-03-08T23:50:02.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836511' 2026-03-08T23:50:02.299 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:02.299 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:50:02.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672984 2026-03-08T23:50:02.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672984 2026-03-08T23:50:02.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836511 1-42949672984' 2026-03-08T23:50:02.378 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:02.378 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:50:02.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247816 2026-03-08T23:50:02.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247816 2026-03-08T23:50:02.457 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836511 1-42949672984 2-98784247816' 2026-03-08T23:50:02.457 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:02.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836511 2026-03-08T23:50:02.457 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:02.458 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:50:02.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836511 2026-03-08T23:50:02.458 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:02.459 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836511 2026-03-08T23:50:02.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836511 2026-03-08T23:50:02.459 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836511' 2026-03-08T23:50:02.459 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:02.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836510 -lt 21474836511 2026-03-08T23:50:02.632 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:03.633 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:50:03.633 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:03.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836510 -lt 21474836511 2026-03-08T23:50:03.807 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:04.809 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:50:04.809 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:04.967 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836513 -lt 21474836511 2026-03-08T23:50:04.968 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:04.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672984 2026-03-08T23:50:04.968 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:04.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:50:04.969 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672984 2026-03-08T23:50:04.969 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:04.970 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672984 2026-03-08T23:50:04.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672984 2026-03-08T23:50:04.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672984' 2026-03-08T23:50:04.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:50:05.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672986 -lt 42949672984 2026-03-08T23:50:05.136 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:05.136 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:05.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-98784247816 2026-03-08T23:50:05.137 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:50:05.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-98784247816 
2026-03-08T23:50:05.137 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:05.138 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 98784247816 2026-03-08T23:50:05.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247816 2026-03-08T23:50:05.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 98784247816' 2026-03-08T23:50:05.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:50:05.303 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 1.0 loop 0 2026-03-08T23:50:05.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247817 -lt 98784247816 2026-03-08T23:50:05.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:50:05.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 1.0 loop 0' 2026-03-08T23:50:05.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 1.0 2026-03-08T23:50:05.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=1.0 2026-03-08T23:50:05.303 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:50:05.303 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 1.0 query 2026-03-08T23:50:05.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:50:05.385 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean 2026-03-08T23:50:05.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:50:05.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 2026-03-08T23:50:05.386 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:50:05.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: get_last_scrub_stamp 1.0 last_deep_scrub_stamp 2026-03-08T23:50:05.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:05.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:50:05.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:05.386 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | 
.last_deep_scrub_stamp' 2026-03-08T23:50:05.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: local last_scrub=2026-03-08T23:49:49.256237+0000 2026-03-08T23:50:05.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: pg_deep_scrub: ceph pg deep-scrub 1.0 2026-03-08T23:50:05.699 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to deep-scrub 2026-03-08T23:50:05.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: pg_deep_scrub: wait_for_scrub 1.0 2026-03-08T23:49:49.256237+0000 last_deep_scrub_stamp 2026-03-08T23:50:05.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:50:05.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:49:49.256237+0000 2026-03-08T23:50:05.712 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_deep_scrub_stamp 2026-03-08T23:50:05.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:50:05.713 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:05.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_deep_scrub_stamp 2026-03-08T23:50:05.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local 
pgid=1.0 2026-03-08T23:50:05.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:50:05.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:05.713 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_deep_scrub_stamp' 2026-03-08T23:50:05.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:49:49.256237+0000 '>' 2026-03-08T23:49:49.256237+0000 2026-03-08T23:50:05.870 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:50:06.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:50:06.871 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:06.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_deep_scrub_stamp 2026-03-08T23:50:06.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:06.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:50:06.872 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:06.872 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_deep_scrub_stamp' 2026-03-08T23:50:07.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:49:49.256237+0000 '>' 2026-03-08T23:49:49.256237+0000 2026-03-08T23:50:07.033 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:50:08.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:50:08.034 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:08.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_deep_scrub_stamp 2026-03-08T23:50:08.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:08.034 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:50:08.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:08.035 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_deep_scrub_stamp' 2026-03-08T23:50:08.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:50:06.503416+0000 '>' 2026-03-08T23:49:49.256237+0000 2026-03-08T23:50:08.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:50:08.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:98: TEST_scrub_test: ceph pg dump pgs 2026-03-08T23:50:08.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:98: TEST_scrub_test: grep -q -- +inconsistent 2026-03-08T23:50:08.203 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:98: TEST_scrub_test: grep '^1.0' 2026-03-08T23:50:08.352 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:50:08.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:99: TEST_scrub_test: ceph pg 1.0 query 2026-03-08T23:50:08.367 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:99: TEST_scrub_test: jq .info.stats.stat_sum.num_scrub_errors 2026-03-08T23:50:08.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:99: TEST_scrub_test: test 2 = 2 2026-03-08T23:50:08.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:101: TEST_scrub_test: ceph osd out 1 2026-03-08T23:50:08.643 
INFO:tasks.workunit.client.0.vm03.stderr:marked out osd.1. 2026-03-08T23:50:08.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:102: TEST_scrub_test: wait_for_clean 2026-03-08T23:50:08.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:50:08.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:50:08.660 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:50:08.660 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:50:08.661 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:50:08.661 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:50:08.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:50:08.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:50:08.661 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:50:08.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' 
'3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:50:08.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:50:08.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:50:08.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:50:08.730 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:50:08.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:50:08.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:50:08.904 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:08.904 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:50:08.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:50:08.904 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:08.904 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:50:08.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836519 2026-03-08T23:50:08.980 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836519 2026-03-08T23:50:08.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519' 2026-03-08T23:50:08.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:08.980 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:50:09.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672992 2026-03-08T23:50:09.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672992 2026-03-08T23:50:09.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-42949672992' 2026-03-08T23:50:09.059 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:09.059 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:50:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247823 2026-03-08T23:50:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247823 2026-03-08T23:50:09.133 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836519 1-42949672992 2-98784247823' 2026-03-08T23:50:09.133 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:09.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836519 2026-03-08T23:50:09.133 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:09.134 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:50:09.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836519 2026-03-08T23:50:09.134 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:09.135 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836519 2026-03-08T23:50:09.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836519 2026-03-08T23:50:09.135 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836519' 2026-03-08T23:50:09.135 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:09.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 
21474836517 -lt 21474836519 2026-03-08T23:50:09.294 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:10.295 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:50:10.295 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:10.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836520 -lt 21474836519 2026-03-08T23:50:10.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:10.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672992 2026-03-08T23:50:10.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:10.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:50:10.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672992 2026-03-08T23:50:10.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:10.454 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672992 2026-03-08T23:50:10.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: 
seq=42949672992 2026-03-08T23:50:10.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672992' 2026-03-08T23:50:10.454 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:50:10.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672993 -lt 42949672992 2026-03-08T23:50:10.646 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:10.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-98784247823 2026-03-08T23:50:10.646 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:10.647 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:50:10.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-98784247823 2026-03-08T23:50:10.648 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:10.649 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 98784247823 2026-03-08T23:50:10.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247823 2026-03-08T23:50:10.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: 
flush_pg_stats: echo 'waiting osd.2 seq 98784247823' 2026-03-08T23:50:10.649 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:50:10.805 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247824 -lt 98784247823 2026-03-08T23:50:10.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:50:10.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:10.805 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:10.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:50:10.991 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:50:10.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:50:10.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:50:10.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:50:10.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: 
get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:50:10.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:50:10.992 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:50:11.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:50:11.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:50:11.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:11.141 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:104: TEST_scrub_test: pg_deep_scrub 1.0 2026-03-08T23:50:11.325 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1941: pg_deep_scrub: local pgid=1.0 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1943: pg_deep_scrub: wait_for_pg_clean 1.0 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1711: wait_for_pg_clean: local pg_id=1.0 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: get_timeout_delays 90 1 3 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:50:11.325 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:50:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: delays=('1' '2' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3' '3') 2026-03-08T23:50:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1712: wait_for_pg_clean: local -a 
delays 2026-03-08T23:50:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1713: wait_for_pg_clean: local -i loop=0 2026-03-08T23:50:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1715: wait_for_pg_clean: flush_pg_stats 2026-03-08T23:50:11.460 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:50:11.460 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:50:11.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:50:11.618 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:11.618 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:50:11.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:50:11.618 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:11.618 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:50:11.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836522 2026-03-08T23:50:11.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836522 2026-03-08T23:50:11.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: 
flush_pg_stats: seqs=' 0-21474836522' 2026-03-08T23:50:11.688 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:11.688 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:50:11.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672996 2026-03-08T23:50:11.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949672996 2026-03-08T23:50:11.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949672996' 2026-03-08T23:50:11.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:11.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:50:11.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247827 2026-03-08T23:50:11.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247827 2026-03-08T23:50:11.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836522 1-42949672996 2-98784247827' 2026-03-08T23:50:11.823 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in 
$seqs 2026-03-08T23:50:11.823 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836522 2026-03-08T23:50:11.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:11.824 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:50:11.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836522 2026-03-08T23:50:11.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:11.825 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836522 2026-03-08T23:50:11.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836522 2026-03-08T23:50:11.826 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836522' 2026-03-08T23:50:11.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:11.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836520 -lt 21474836522 2026-03-08T23:50:11.975 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:12.976 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 
-eq 0 ']' 2026-03-08T23:50:12.976 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:13.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836523 -lt 21474836522 2026-03-08T23:50:13.138 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:13.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949672996 2026-03-08T23:50:13.138 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:13.139 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:50:13.139 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672996 2026-03-08T23:50:13.139 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:13.140 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672996 2026-03-08T23:50:13.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672996 2026-03-08T23:50:13.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672996' 2026-03-08T23:50:13.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:50:13.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672996 -lt 42949672996 2026-03-08T23:50:13.301 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:13.301 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-98784247827 2026-03-08T23:50:13.301 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:13.302 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:50:13.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-98784247827 2026-03-08T23:50:13.303 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:13.304 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 98784247827 2026-03-08T23:50:13.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247827 2026-03-08T23:50:13.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 98784247827' 2026-03-08T23:50:13.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:50:13.467 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247827 -lt 98784247827 2026-03-08T23:50:13.467 INFO:tasks.workunit.client.0.vm03.stdout:#---------- 1.0 loop 0 2026-03-08T23:50:13.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1717: wait_for_pg_clean: true 2026-03-08T23:50:13.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1718: wait_for_pg_clean: echo '#---------- 1.0 loop 0' 2026-03-08T23:50:13.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: is_pg_clean 1.0 2026-03-08T23:50:13.467 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1581: is_pg_clean: local pgid=1.0 2026-03-08T23:50:13.468 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1582: is_pg_clean: local pg_state 2026-03-08T23:50:13.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: ceph pg 1.0 query 2026-03-08T23:50:13.468 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: jq -r '.state ' 2026-03-08T23:50:13.546 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1583: is_pg_clean: pg_state=active+clean+remapped 2026-03-08T23:50:13.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1584: is_pg_clean: [[ active+clean+remapped == \a\c\t\i\v\e\+\c\l\e\a\n* ]] 2026-03-08T23:50:13.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1719: wait_for_pg_clean: break 
2026-03-08T23:50:13.547 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1728: wait_for_pg_clean: return 0 2026-03-08T23:50:13.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: get_last_scrub_stamp 1.0 last_deep_scrub_stamp 2026-03-08T23:50:13.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:13.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:50:13.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:13.547 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_deep_scrub_stamp' 2026-03-08T23:50:13.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1944: pg_deep_scrub: local last_scrub=2026-03-08T23:50:06.503416+0000 2026-03-08T23:50:13.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1945: pg_deep_scrub: ceph pg deep-scrub 1.0 2026-03-08T23:50:13.871 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.0 to deep-scrub 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1946: pg_deep_scrub: wait_for_scrub 1.0 2026-03-08T23:50:06.503416+0000 last_deep_scrub_stamp 2026-03-08T23:50:13.883 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:50:06.503416+0000 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: local sname=last_deep_scrub_stamp 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_deep_scrub_stamp 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:13.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_deep_scrub_stamp' 2026-03-08T23:50:14.051 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:50:06.503416+0000 '>' 2026-03-08T23:50:06.503416+0000 2026-03-08T23:50:14.052 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:50:15.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:50:15.053 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:15.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_deep_scrub_stamp 2026-03-08T23:50:15.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:15.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:50:15.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:15.053 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_deep_scrub_stamp' 2026-03-08T23:50:15.215 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:50:06.503416+0000 '>' 2026-03-08T23:50:06.503416+0000 2026-03-08T23:50:15.215 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:50:16.216 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:50:16.217 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:16.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_deep_scrub_stamp 2026-03-08T23:50:16.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:16.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_deep_scrub_stamp 2026-03-08T23:50:16.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:16.217 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_deep_scrub_stamp' 2026-03-08T23:50:16.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:50:14.556747+0000 '>' 2026-03-08T23:50:06.503416+0000 2026-03-08T23:50:16.382 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:50:16.382 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:106: TEST_scrub_test: ceph pg 1.0 query 2026-03-08T23:50:16.382 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:106: TEST_scrub_test: jq .info.stats.stat_sum.num_scrub_errors 2026-03-08T23:50:16.462 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:106: TEST_scrub_test: test 2 = 2 2026-03-08T23:50:16.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:107: TEST_scrub_test: ceph pg 1.0 query 2026-03-08T23:50:16.463 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:107: TEST_scrub_test: jq '.peer_info[0].stats.stat_sum.num_scrub_errors' 2026-03-08T23:50:16.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:107: TEST_scrub_test: test 2 = 2 2026-03-08T23:50:16.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:108: TEST_scrub_test: ceph pg dump pgs 2026-03-08T23:50:16.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:108: TEST_scrub_test: grep -q -- +inconsistent 2026-03-08T23:50:16.552 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:108: TEST_scrub_test: grep '^1.0' 2026-03-08T23:50:16.706 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:50:16.719 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:110: TEST_scrub_test: ceph osd in 1 2026-03-08T23:50:16.927 INFO:tasks.workunit.client.0.vm03.stderr:marked in osd.1. 
2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:111: TEST_scrub_test: wait_for_clean 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:50:16.945 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:50:17.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 
2026-03-08T23:50:17.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:50:17.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:50:17.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:50:17.003 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:50:17.003 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:50:17.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:50:17.176 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:17.176 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:50:17.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:50:17.176 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:17.176 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:50:17.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836529 2026-03-08T23:50:17.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
21474836529 2026-03-08T23:50:17.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529' 2026-03-08T23:50:17.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:17.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:50:17.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673002 2026-03-08T23:50:17.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673002 2026-03-08T23:50:17.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673002' 2026-03-08T23:50:17.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:17.329 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:50:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247834 2026-03-08T23:50:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247834 2026-03-08T23:50:17.408 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836529 1-42949673002 2-98784247834' 
2026-03-08T23:50:17.409 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:17.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836529 2026-03-08T23:50:17.409 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:17.410 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:50:17.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836529 2026-03-08T23:50:17.410 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:17.411 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836529 2026-03-08T23:50:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836529 2026-03-08T23:50:17.411 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836529' 2026-03-08T23:50:17.411 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:17.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836527 -lt 21474836529 2026-03-08T23:50:17.577 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 
2026-03-08T23:50:18.579 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:50:18.579 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:18.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836530 -lt 21474836529 2026-03-08T23:50:18.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:18.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673002 2026-03-08T23:50:18.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:18.754 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:50:18.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673002 2026-03-08T23:50:18.755 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:18.756 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673002 2026-03-08T23:50:18.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673002 2026-03-08T23:50:18.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 
42949673002' 2026-03-08T23:50:18.756 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:50:18.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673003 -lt 42949673002 2026-03-08T23:50:18.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:18.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-98784247834 2026-03-08T23:50:18.938 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:18.939 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:50:18.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-98784247834 2026-03-08T23:50:18.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:18.941 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 98784247834 2026-03-08T23:50:18.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247834 2026-03-08T23:50:18.941 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 98784247834' 2026-03-08T23:50:18.941 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:50:19.112 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247834 -lt 98784247834 2026-03-08T23:50:19.112 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:50:19.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:19.113 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:19.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:50:19.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:50:19.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:50:19.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:50:19.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:50:19.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:50:19.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: 
get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:50:19.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:50:19.509 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:50:19.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:50:19.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:19.509 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:113: TEST_scrub_test: repair 1.0 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1896: repair: local pgid=1.0 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: 
get_last_scrub_stamp 1.0 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:19.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:50:19.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1897: repair: local last_scrub=2026-03-08T23:50:14.556747+0000 2026-03-08T23:50:19.901 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1898: repair: ceph pg repair 1.0 2026-03-08T23:50:20.056 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to repair 2026-03-08T23:50:20.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1899: repair: wait_for_scrub 1.0 2026-03-08T23:50:14.556747+0000 2026-03-08T23:50:20.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2072: wait_for_scrub: local pgid=1.0 2026-03-08T23:50:20.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2073: wait_for_scrub: local last_scrub=2026-03-08T23:50:14.556747+0000 2026-03-08T23:50:20.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2074: wait_for_scrub: 
local sname=last_scrub_stamp 2026-03-08T23:50:20.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i=0 )) 2026-03-08T23:50:20.068 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:20.068 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:50:20.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:20.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:50:20.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:20.069 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:50:20.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:50:14.556747+0000 '>' 2026-03-08T23:50:14.556747+0000 2026-03-08T23:50:20.233 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:50:21.234 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:50:21.234 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:21.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:50:21.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:21.234 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:50:21.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:21.235 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:50:21.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:50:14.556747+0000 '>' 2026-03-08T23:50:14.556747+0000 2026-03-08T23:50:21.404 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2080: wait_for_scrub: sleep 1 2026-03-08T23:50:22.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i++ )) 2026-03-08T23:50:22.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2076: wait_for_scrub: (( i < 500 )) 2026-03-08T23:50:22.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: 
wait_for_scrub: get_last_scrub_stamp 1.0 last_scrub_stamp 2026-03-08T23:50:22.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1526: get_last_scrub_stamp: local pgid=1.0 2026-03-08T23:50:22.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1527: get_last_scrub_stamp: local sname=last_scrub_stamp 2026-03-08T23:50:22.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1528: get_last_scrub_stamp: ceph --format json pg dump pgs 2026-03-08T23:50:22.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1529: get_last_scrub_stamp: jq -r '.pg_stats | .[] | select(.pgid=="1.0") | .last_scrub_stamp' 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2077: wait_for_scrub: test 2026-03-08T23:50:20.497321+0000 '>' 2026-03-08T23:50:14.556747+0000 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2078: wait_for_scrub: return 0 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:114: TEST_scrub_test: wait_for_clean 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:50:22.574 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:50:22.574 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:50:22.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:50:22.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:50:22.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:50:22.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:50:22.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:50:22.639 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:50:22.802 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:50:22.802 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:22.802 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:50:22.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:50:22.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:22.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:50:22.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836535 2026-03-08T23:50:22.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836535 2026-03-08T23:50:22.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836535' 2026-03-08T23:50:22.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:22.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:50:22.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: 
seq=42949673009 2026-03-08T23:50:22.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673009 2026-03-08T23:50:22.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836535 1-42949673009' 2026-03-08T23:50:22.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:22.960 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:50:23.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247840 2026-03-08T23:50:23.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247840 2026-03-08T23:50:23.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836535 1-42949673009 2-98784247840' 2026-03-08T23:50:23.038 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:23.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836535 2026-03-08T23:50:23.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:23.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:50:23.040 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836535 2026-03-08T23:50:23.040 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:23.041 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836535 2026-03-08T23:50:23.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836535 2026-03-08T23:50:23.041 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836535' 2026-03-08T23:50:23.041 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:23.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836534 -lt 21474836535 2026-03-08T23:50:23.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:24.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:50:24.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:24.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836537 -lt 21474836535 2026-03-08T23:50:24.368 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: 
for s in $seqs 2026-03-08T23:50:24.369 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673009 2026-03-08T23:50:24.369 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:24.369 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:50:24.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673009 2026-03-08T23:50:24.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:24.371 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673009 2026-03-08T23:50:24.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673009 2026-03-08T23:50:24.371 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673009' 2026-03-08T23:50:24.371 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:50:24.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673010 -lt 42949673009 2026-03-08T23:50:24.538 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:24.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: echo 2-98784247840 2026-03-08T23:50:24.538 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:24.539 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:50:24.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-98784247840 2026-03-08T23:50:24.539 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:24.540 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 98784247840 2026-03-08T23:50:24.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247840 2026-03-08T23:50:24.540 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 98784247840' 2026-03-08T23:50:24.540 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:50:24.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247841 -lt 98784247840 2026-03-08T23:50:24.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:50:24.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:24.705 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:24.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:50:24.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:50:24.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:50:24.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:50:24.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:50:24.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:50:24.897 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:50:24.898 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:50:25.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:50:25.061 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:50:25.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:25.061 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:25.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:50:25.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:50:25.249 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:50:25.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:117: TEST_scrub_test: ceph pg 1.0 query 2026-03-08T23:50:25.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:117: TEST_scrub_test: jq '.peer_info[0].stats.stat_sum.num_scrub_errors' 2026-03-08T23:50:25.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:117: TEST_scrub_test: test 2 = 2 2026-03-08T23:50:25.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:118: TEST_scrub_test: ceph pg dump pgs 2026-03-08T23:50:25.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:118: TEST_scrub_test: grep -vq -- +inconsistent 2026-03-08T23:50:25.330 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:118: TEST_scrub_test: grep '^1.0' 2026-03-08T23:50:25.488 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:50:25.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:120: TEST_scrub_test: ceph osd out 1 2026-03-08T23:50:25.717 INFO:tasks.workunit.client.0.vm03.stderr:marked out osd.1. 2026-03-08T23:50:25.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:121: TEST_scrub_test: wait_for_clean 2026-03-08T23:50:25.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:50:25.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:50:25.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:50:25.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:50:25.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:50:25.738 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:50:25.738 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:50:25.738 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:50:25.739 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:50:25.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:50:25.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:50:25.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:50:25.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:50:25.803 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=500 2026-03-08T23:50:25.803 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:50:25.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:50:25.985 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:25.985 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:50:25.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:50:25.985 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: 
flush_pg_stats: for osd in $ids 2026-03-08T23:50:25.985 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:50:26.070 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836540 2026-03-08T23:50:26.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836540 2026-03-08T23:50:26.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836540' 2026-03-08T23:50:26.071 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:26.071 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:50:26.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949673013 2026-03-08T23:50:26.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 42949673013 2026-03-08T23:50:26.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836540 1-42949673013' 2026-03-08T23:50:26.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:26.152 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 
2026-03-08T23:50:26.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247844 2026-03-08T23:50:26.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247844 2026-03-08T23:50:26.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836540 1-42949673013 2-98784247844' 2026-03-08T23:50:26.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:26.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836540 2026-03-08T23:50:26.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:26.230 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:50:26.230 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836540 2026-03-08T23:50:26.231 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:26.231 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836540 2026-03-08T23:50:26.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836540 2026-03-08T23:50:26.232 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting 
osd.0 seq 21474836540' 2026-03-08T23:50:26.232 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:26.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836539 -lt 21474836540 2026-03-08T23:50:26.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:27.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 500 -eq 0 ']' 2026-03-08T23:50:27.396 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:27.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836539 -lt 21474836540 2026-03-08T23:50:27.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:28.564 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 499 -eq 0 ']' 2026-03-08T23:50:28.564 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:28.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836542 -lt 21474836540 2026-03-08T23:50:28.729 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:28.729 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-42949673013 2026-03-08T23:50:28.729 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:28.731 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:50:28.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949673013 2026-03-08T23:50:28.731 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:28.732 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949673013 2026-03-08T23:50:28.732 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949673013 2026-03-08T23:50:28.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949673013' 2026-03-08T23:50:28.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:50:28.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949673015 -lt 42949673013 2026-03-08T23:50:28.909 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:28.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-98784247844 
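The wait phase traced above then polls `ceph osd last-stat-seq <osd>` for each recorded pair until the reported value reaches the flushed sequence, sleeping one second per attempt and counting down a timeout (the trace shows `test 21474836539 -lt 21474836540` repeating until the OSD catches up). A hedged sketch of that loop; `ceph` is again stubbed, using a temp file as a call counter so the stubbed value "catches up" after two polls and the sketch terminates:

```shell
#!/usr/bin/env bash
# Stub of `ceph osd last-stat-seq <id>`: a file-backed counter makes the
# reported seq lag behind the target for two polls, then catch up.
# (A file is used because the stub runs in a command-substitution subshell.)
statefile=$(mktemp)
echo 0 > "$statefile"
ceph() {
    n=$(cat "$statefile"); n=$((n + 1)); echo "$n" > "$statefile"
    if [ "$n" -lt 3 ]; then echo 99; else echo 100; fi
}

timeout=500          # matches the helper's 500 one-second attempts
seqs=' 0-100'        # pairs produced by the collect phase (invented value)
for s in $seqs; do
    osd=$(echo "$s" | cut -d - -f 1)
    seq=$(echo "$s" | cut -d - -f 2)
    echo "waiting osd.$osd seq $seq"
    while test "$(ceph osd last-stat-seq "$osd")" -lt "$seq"; do
        sleep 1
        timeout=$((timeout - 1))
        [ "$timeout" -eq 0 ] && { echo "timed out"; exit 1; }
    done
done
rm -f "$statefile"
```

This is why a lagging OSD only costs polling time: once `last-stat-seq` meets or exceeds the recorded flush seq (it may overshoot, as in the trace's `test 21474836542 -lt 21474836540`), the loop moves on to the next OSD immediately.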
2026-03-08T23:50:28.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:28.911 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:50:28.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-98784247844 2026-03-08T23:50:28.912 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:28.913 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 98784247844 2026-03-08T23:50:28.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247844 2026-03-08T23:50:28.913 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 98784247844' 2026-03-08T23:50:28.913 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:50:29.079 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247846 -lt 98784247844 2026-03-08T23:50:29.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:50:29.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:29.079 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq 
.pgmap.num_pgs 2026-03-08T23:50:29.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:50:29.290 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:50:29.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:50:29.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:50:29.290 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:50:29.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:50:29.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:50:29.291 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:50:29.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:50:29.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:50:29.469 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:29.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:29.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:50:29.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:50:29.672 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:50:29.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:123: TEST_scrub_test: ceph pg 1.0 query 2026-03-08T23:50:29.672 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:123: TEST_scrub_test: jq .info.stats.stat_sum.num_scrub_errors 2026-03-08T23:50:29.752 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:123: TEST_scrub_test: test 0 = 0 2026-03-08T23:50:29.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:124: TEST_scrub_test: ceph pg 1.0 query 2026-03-08T23:50:29.753 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:124: TEST_scrub_test: jq '.peer_info[0].stats.stat_sum.num_scrub_errors' 2026-03-08T23:50:29.834 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:124: TEST_scrub_test: test 0 = 0 2026-03-08T23:50:29.834 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:125: TEST_scrub_test: ceph pg 1.0 query 2026-03-08T23:50:29.834 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:125: TEST_scrub_test: jq '.peer_info[1].stats.stat_sum.num_scrub_errors' 2026-03-08T23:50:29.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:125: TEST_scrub_test: test 0 = 0 2026-03-08T23:50:29.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:126: TEST_scrub_test: ceph pg dump pgs 2026-03-08T23:50:29.917 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:126: TEST_scrub_test: grep -vq -- +inconsistent 2026-03-08T23:50:29.918 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:126: TEST_scrub_test: grep '^1.0' 2026-03-08T23:50:30.072 INFO:tasks.workunit.client.0.vm03.stderr:dumped pgs 2026-03-08T23:50:30.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:127: TEST_scrub_test: perf_counters td/osd-scrub-test 3 2026-03-08T23:50:30.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:43: perf_counters: local dir=td/osd-scrub-test 2026-03-08T23:50:30.085 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:44: perf_counters: local OSDS=3 2026-03-08T23:50:30.085 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: expr 3 - 1 2026-03-08T23:50:30.086 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: seq 0 2 2026-03-08T23:50:30.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:50:30.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.0 counter dump 2026-03-08T23:50:30.087 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))' 2026-03-08T23:50:30.155 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:50:30.155 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_ec": [ 2026-03-08T23:50:30.155 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:50:30.155 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:50:30.155 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:50:30.155 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:50:30.155 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.156 
INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.156 
INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_dp_repl": [ 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "level": "deep", 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 1, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 1, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 1, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 1, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0.008000008, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0.008000008 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:50:30.156 
INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.156 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 2, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 1, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: 
"replicas_in_reservation": 2 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "osd_scrub_sh_ec": [ 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "ec" 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.157 
INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.157 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: ], 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: 
"osd_scrub_sh_repl": [ 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: { 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "labels": { 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "level": "shallow", 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "pooltype": "replicated" 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "counters": { 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_started": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "num_scrubs_past_reservation": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "successful_scrubs_elapsed": { 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "failed_scrubs_elapsed": { 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "preemptions": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_selected": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "chunk_busy": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "locked_object": 
0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "write_blocked_by_scrub": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "scrub_reservations_completed": 0, 2026-03-08T23:50:30.158 INFO:tasks.workunit.client.0.vm03.stdout: "successful_reservations_elapsed": { 2026-03-08T23:50:30.164 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.164 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_aborted": 0, 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_failure": 0, 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "reservation_process_skipped": 0, 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "failed_reservations_elapsed": { 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "avgcount": 0, 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "sum": 0, 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "avgtime": 0 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: }, 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: "replicas_in_reservation": 0 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: } 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout: ] 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1)) 2026-03-08T23:50:30.165 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.1 counter dump
2026-03-08T23:50:30.165 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))'
2026-03-08T23:50:30.236 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:50:30.236 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_ec": [
2026-03-08T23:50:30.236 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:50:30.236 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:50:30.236 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:50:30.236 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:50:30.236 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:50:30.236 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_repl": [
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 3,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 3,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 3,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 3,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0.008000009,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0.002666669
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.237 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 6,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 3,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 2
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_ec": [
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:50:30.238 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_repl": [
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:50:30.239 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:  ]
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:50:30.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:45: perf_counters: for osd in $(seq 0 $(expr $OSDS - 1))
2026-03-08T23:50:30.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: ceph tell osd.2 counter dump
2026-03-08T23:50:30.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:47: perf_counters: jq 'with_entries(select(.key | startswith("osd_scrub")))'
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:{
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_ec": [
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_dp_repl": [
2026-03-08T23:50:30.323 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "deep",
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:50:30.324 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_ec": [
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "ec"
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:50:30.325 INFO:tasks.workunit.client.0.vm03.stdout:  ],
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:  "osd_scrub_sh_repl": [
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:    {
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:      "labels": {
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "level": "shallow",
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "pooltype": "replicated"
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:      },
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:      "counters": {
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_started": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "num_scrubs_past_reservation": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_scrubs_elapsed": {
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_scrubs_elapsed": {
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "preemptions": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_selected": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "chunk_busy": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "locked_object": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "write_blocked_by_scrub": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "scrub_reservations_completed": 0,
2026-03-08T23:50:30.326 INFO:tasks.workunit.client.0.vm03.stdout:        "successful_reservations_elapsed": {
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_aborted": 0,
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_failure": 0,
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:        "reservation_process_skipped": 0,
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:        "failed_reservations_elapsed": {
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:          "avgcount": 0,
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:          "sum": 0,
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:          "avgtime": 0
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:        },
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:        "replicas_in_reservation": 0
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:      }
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:    }
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:  ]
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:}
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Teardown Test TEST_scrub_test ------------------
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:36: run: echo '-------------- Teardown Test TEST_scrub_test ------------------'
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:37: run: teardown td/osd-scrub-test
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:50:30.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:50:30.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:50:30.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:50:30.335 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:50:30.450 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:50:30.450 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:50:30.451 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:50:30.452 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:50:30.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:50:30.453 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:50:30.453 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:50:30.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:50:30.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:50:30.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:50:30.454 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:50:30.454 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:50:30.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']'
2026-03-08T23:50:30.455 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test
2026-03-08T23:50:30.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:50:30.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:50:30.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827
2026-03-08T23:50:30.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stdout:-------------- Complete Test TEST_scrub_test ------------------
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']'
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-scrub-test.sh:38: run: echo '-------------- Complete Test TEST_scrub_test ------------------'
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-scrub-test 0
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-scrub-test
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-scrub-test KILL
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:50:30.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:50:30.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:50:30.475 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname
2026-03-08T23:50:30.476 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']'
2026-03-08T23:50:30.476 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T .
2026-03-08T23:50:30.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']'
2026-03-08T23:50:30.477 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no
2026-03-08T23:50:30.477 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern
2026-03-08T23:50:30.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:50:30.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']'
2026-03-08T23:50:30.478 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$'
2026-03-08T23:50:30.479 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-08T23:50:30.479 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump
2026-03-08T23:50:30.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']'
2026-03-08T23:50:30.480 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-scrub-test
2026-03-08T23:50:30.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir
2026-03-08T23:50:30.481
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:30.481 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.475827 2026-03-08T23:50:30.481 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.475827 2026-03-08T23:50:30.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:50:30.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:50:30.482 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T23:50:30.482 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T23:50:30.482 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T23:50:30.531 INFO:tasks.workunit:Running workunit scrub/osd-unexpected-clone.sh... 
2026-03-08T23:50:30.531 DEBUG:teuthology.orchestra.run.vm03:workunit test scrub/osd-unexpected-clone.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=569c3e99c9b32a51b4eaf08731c728f4513ed589 TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh 2026-03-08T23:50:30.580 INFO:tasks.workunit.client.0.vm03.stderr:stty: 'standard input': Inappropriate ioctl for device 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:+ PS4='${BASH_SOURCE[0]}:$LINENO: ${FUNCNAME[0]}: ' 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: export PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2370: main: PATH=.:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/sbin 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: export PYTHONWARNINGS=ignore 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2371: main: PYTHONWARNINGS=ignore 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: export CEPH_CONF=/dev/null 2026-03-08T23:50:30.583 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2372: main: CEPH_CONF=/dev/null 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2373: main: unset CEPH_ARGS 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2375: main: local code 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2376: main: run td/osd-unexpected-clone 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:22: run: local dir=td/osd-unexpected-clone 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:23: run: shift 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:25: run: export CEPH_MON=127.0.0.1:7144 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:25: run: CEPH_MON=127.0.0.1:7144 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:26: run: export CEPH_ARGS 2026-03-08T23:50:30.583 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:27: run: uuidgen 2026-03-08T23:50:30.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:27: run: CEPH_ARGS+='--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none ' 2026-03-08T23:50:30.584 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:28: run: CEPH_ARGS+='--mon-host=127.0.0.1:7144 ' 2026-03-08T23:50:30.584 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:30: run: export -n CEPH_CLI_TEST_DUP_COMMAND 2026-03-08T23:50:30.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:31: run: set 2026-03-08T23:50:30.584 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:31: run: sed -n -e 's/^\(TEST_[0-9a-z_]*\) .*/\1/p' 2026-03-08T23:50:30.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:31: run: local funcs=TEST_recover_unexpected 2026-03-08T23:50:30.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:32: run: for func in $funcs 2026-03-08T23:50:30.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:33: run: setup td/osd-unexpected-clone 2026-03-08T23:50:30.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:131: setup: local dir=td/osd-unexpected-clone 2026-03-08T23:50:30.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:132: setup: teardown td/osd-unexpected-clone 2026-03-08T23:50:30.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-unexpected-clone 2026-03-08T23:50:30.585 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:50:30.585 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-unexpected-clone KILL 2026-03-08T23:50:30.585 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:50:30.586 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:50:30.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:50:30.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:50:30.586 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:50:30.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:50:30.587 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:50:30.587 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:50:30.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:50:30.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:50:30.588 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:50:30.588 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:50:30.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:50:30.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:50:30.589 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:50:30.589 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:50:30.590 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:50:30.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:50:30.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-unexpected-clone 2026-03-08T23:50:30.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:50:30.591 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:30.591 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:30.591 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.601329 2026-03-08T23:50:30.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:50:30.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:50:30.592 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:133: setup: mkdir -p td/osd-unexpected-clone 2026-03-08T23:50:30.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: get_asok_dir 2026-03-08T23:50:30.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:30.593 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:30.593 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:134: setup: mkdir -p /tmp/ceph-asok.601329 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: ulimit -n 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:135: setup: '[' 1024 -le 1024 ']' 
2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:136: setup: ulimit -n 4096 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:138: setup: '[' -z '' ']' 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:139: setup: trap 'teardown td/osd-unexpected-clone 1' TERM HUP INT 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:34: run: TEST_recover_unexpected td/osd-unexpected-clone 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:40: TEST_recover_unexpected: local dir=td/osd-unexpected-clone 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:42: TEST_recover_unexpected: run_mon td/osd-unexpected-clone a 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:448: run_mon: local dir=td/osd-unexpected-clone 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:449: run_mon: shift 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:450: run_mon: local id=a 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:451: run_mon: shift 2026-03-08T23:50:30.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:452: run_mon: local data=td/osd-unexpected-clone/a 2026-03-08T23:50:30.594 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:455: run_mon: ceph-mon --id a --mkfs --mon-data=td/osd-unexpected-clone/a --run-dir=td/osd-unexpected-clone 2026-03-08T23:50:30.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: get_asok_path 2026-03-08T23:50:30.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:50:30.615 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:50:30.616 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:50:30.616 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:30.616 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:30.616 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:30.616 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:462: run_mon: ceph-mon --id a --osd-failsafe-full-ratio=.99 --mon-osd-full-ratio=.99 --mon-data-avail-crit=1 --mon-data-avail-warn=5 --paxos-propose-interval=0.1 --osd-crush-chooseleaf-type=0 --debug-mon 20 --debug-ms 20 --debug-paxos 20 --chdir= --mon-data=td/osd-unexpected-clone/a '--log-file=td/osd-unexpected-clone/$name.log' '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --mon-cluster-log-file=td/osd-unexpected-clone/log 
--run-dir=td/osd-unexpected-clone '--pid-file=td/osd-unexpected-clone/$name.pid' --mon-allow-pool-delete --mon-allow-pool-size-one --osd-pool-default-pg-autoscale-mode off --mon-osd-backfillfull-ratio .99 --mon-warn-on-insecure-global-id-reclaim-allowed=false 2026-03-08T23:50:30.639 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: cat 2026-03-08T23:50:30.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a fsid 2026-03-08T23:50:30.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:50:30.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:50:30.639 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=fsid 2026-03-08T23:50:30.640 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:50:30.640 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:50:30.640 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:50:30.640 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:50:30.640 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:30.640 
INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:30.640 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.601329/ceph-mon.a.asok 2026-03-08T23:50:30.640 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .fsid 2026-03-08T23:50:30.641 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:50:30.641 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.601329/ceph-mon.a.asok config get fsid 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:487: run_mon: get_config mon a mon_host 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1125: get_config: local daemon=mon 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1126: get_config: local id=a 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1127: get_config: local config=mon_host 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1132: get_config: jq -r .mon_host 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: get_asok_path mon.a 2026-03-08T23:50:30.707 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=mon.a 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n mon.a ']' 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: get_asok_dir 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:118: get_asok_path: echo /tmp/ceph-asok.601329/ceph-mon.a.asok 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: CEPH_ARGS= 2026-03-08T23:50:30.707 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1129: get_config: ceph --format json daemon /tmp/ceph-asok.601329/ceph-mon.a.asok config get mon_host 2026-03-08T23:50:30.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:43: TEST_recover_unexpected: run_mgr td/osd-unexpected-clone x 2026-03-08T23:50:30.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:553: run_mgr: local dir=td/osd-unexpected-clone 2026-03-08T23:50:30.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:554: run_mgr: shift 2026-03-08T23:50:30.774 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:555: run_mgr: local id=x 2026-03-08T23:50:30.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:556: run_mgr: shift 2026-03-08T23:50:30.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:557: run_mgr: local data=td/osd-unexpected-clone/x 2026-03-08T23:50:30.774 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:559: run_mgr: ceph config set mgr mgr_pool false --force 2026-03-08T23:50:30.882 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: get_asok_path 2026-03-08T23:50:30.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:50:30.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:50:30.883 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:50:30.883 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:30.883 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:30.883 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:30.883 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: realpath /home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:50:30.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:561: run_mgr: ceph-mgr --id x --osd-failsafe-full-ratio=.99 --debug-mgr 20 --debug-objecter 20 --debug-ms 20 --debug-paxos 20 --chdir= --mgr-data=td/osd-unexpected-clone/x '--log-file=td/osd-unexpected-clone/$name.log' '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --run-dir=td/osd-unexpected-clone '--pid-file=td/osd-unexpected-clone/$name.pid' --mgr-module-path=/home/ubuntu/cephtest/clone.client.0/src/pybind/mgr 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:44: TEST_recover_unexpected: run_osd td/osd-unexpected-clone 0 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-unexpected-clone 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=0 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-unexpected-clone/0 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 
--auth-supported=none --mon-host=127.0.0.1:7144 ' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-unexpected-clone/0' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-unexpected-clone/0/journal' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-unexpected-clone' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:50:30.902 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:30.902 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-unexpected-clone/$name.log' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' 
--pid-file=td/osd-unexpected-clone/$name.pid' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:50:30.903 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-unexpected-clone/0 2026-03-08T23:50:30.909 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:50:30.910 INFO:tasks.workunit.client.0.vm03.stdout:add osd0 0c08215e-42b8-43a2-b894-17e6af78232b 2026-03-08T23:50:30.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=0c08215e-42b8-43a2-b894-17e6af78232b 2026-03-08T23:50:30.910 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd0 
0c08215e-42b8-43a2-b894-17e6af78232b' 2026-03-08T23:50:30.910 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:50:30.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBGC65p4UyNNxAAAu/jApG57y7gs6EKXuX+QQ== 2026-03-08T23:50:30.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBGC65p4UyNNxAAAu/jApG57y7gs6EKXuX+QQ=="}' 2026-03-08T23:50:30.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 0c08215e-42b8-43a2-b894-17e6af78232b -i td/osd-unexpected-clone/0/new.json 2026-03-08T23:50:31.032 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:50:31.046 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-unexpected-clone/0/new.json 2026-03-08T23:50:31.047 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 0 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/0 --osd-journal=td/osd-unexpected-clone/0/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBGC65p4UyNNxAAAu/jApG57y7gs6EKXuX+QQ== 
--osd-uuid 0c08215e-42b8-43a2-b894-17e6af78232b 2026-03-08T23:50:31.061 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:31.069+0000 7f1890b128c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:31.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:31.073+0000 7f1890b128c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:31.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:31.073+0000 7f1890b128c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:31.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:31.073+0000 7f1890b128c0 -1 bdev(0x562c5b749c00 td/osd-unexpected-clone/0/block) open stat got: (1) Operation not permitted 2026-03-08T23:50:31.068 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:31.073+0000 7f1890b128c0 -1 bluestore(td/osd-unexpected-clone/0) _read_fsid unparsable uuid 2026-03-08T23:50:33.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-unexpected-clone/0/keyring 2026-03-08T23:50:33.325 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:50:33.326 INFO:tasks.workunit.client.0.vm03.stdout:adding osd0 key to auth repository 2026-03-08T23:50:33.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd0 key to auth repository 2026-03-08T23:50:33.326 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-unexpected-clone/0/keyring auth add osd.0 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:50:33.438 INFO:tasks.workunit.client.0.vm03.stdout:start osd.0 2026-03-08T23:50:33.438 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.0 2026-03-08T23:50:33.438 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:50:33.439 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:50:33.449 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:50:33.452 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 0 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/0 --osd-journal=td/osd-unexpected-clone/0/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:50:33.490 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:33.497+0000 7f3ae3c4a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:33.500 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:33.505+0000 7f3ae3c4a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:33.504 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:33.509+0000 7f3ae3c4a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 0 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=0 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:33.653 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:50:33.816 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:33.965 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:33.973+0000 7f3ae3c4a8c0 -1 Falling back to public interface 2026-03-08T23:50:34.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 
2026-03-08T23:50:34.817 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:34.818 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:50:34.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:50:34.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:34.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:50:34.980 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:35.185 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:35.193+0000 7f3ae3c4a8c0 -1 osd.0 0 log_to_monitors true 2026-03-08T23:50:35.981 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:35.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:35.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:50:35.982 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:50:35.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:35.982 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:50:36.173 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:36.219 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:36.225+0000 7f3adf403640 -1 osd.0 0 waiting for initial osdmap 2026-03-08T23:50:37.175 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:50:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:50:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:37.175 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.0 up' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stdout:osd.0 up in weight 1 up_from 5 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6802/1681036912,v1:127.0.0.1:6803/1681036912] [v2:127.0.0.1:6804/1681036912,v1:127.0.0.1:6805/1681036912] exists,up 0c08215e-42b8-43a2-b894-17e6af78232b 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:50:37.343 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:45: TEST_recover_unexpected: run_osd td/osd-unexpected-clone 1 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-unexpected-clone 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=1 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-unexpected-clone/1 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 ' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' 
--osd-data=td/osd-unexpected-clone/1' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-unexpected-clone/1/journal' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-unexpected-clone' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:37.343 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:37.344 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-unexpected-clone/$name.log' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-unexpected-clone/$name.pid' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:50:37.344 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:50:37.344 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-unexpected-clone/1 2026-03-08T23:50:37.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:50:37.345 INFO:tasks.workunit.client.0.vm03.stdout:add osd1 83e2a4d5-c8bb-4d32-9c95-b4dde6522e14 2026-03-08T23:50:37.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=83e2a4d5-c8bb-4d32-9c95-b4dde6522e14 2026-03-08T23:50:37.345 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd1 83e2a4d5-c8bb-4d32-9c95-b4dde6522e14' 2026-03-08T23:50:37.345 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:50:37.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBNC65psygBFhAA6gOflbG/CgXK2tOi9Ve1QQ== 2026-03-08T23:50:37.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBNC65psygBFhAA6gOflbG/CgXK2tOi9Ve1QQ=="}' 2026-03-08T23:50:37.358 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 83e2a4d5-c8bb-4d32-9c95-b4dde6522e14 -i td/osd-unexpected-clone/1/new.json 2026-03-08T23:50:37.507 
INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:50:37.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-unexpected-clone/1/new.json 2026-03-08T23:50:37.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 1 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/1 --osd-journal=td/osd-unexpected-clone/1/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBNC65psygBFhAA6gOflbG/CgXK2tOi9Ve1QQ== --osd-uuid 83e2a4d5-c8bb-4d32-9c95-b4dde6522e14 2026-03-08T23:50:37.534 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:37.541+0000 7faab61488c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:37.536 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:37.545+0000 7faab61488c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:37.537 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:37.545+0000 7faab61488c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:50:37.537 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:37.545+0000 7faab61488c0 -1 bdev(0x55efb6509c00 td/osd-unexpected-clone/1/block) open stat got: (1) Operation not permitted 2026-03-08T23:50:37.537 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:37.545+0000 7faab61488c0 -1 bluestore(td/osd-unexpected-clone/1) _read_fsid unparsable uuid 2026-03-08T23:50:39.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-unexpected-clone/1/keyring 2026-03-08T23:50:39.845 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:50:39.846 INFO:tasks.workunit.client.0.vm03.stdout:adding osd1 key to auth repository 2026-03-08T23:50:39.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd1 key to auth repository 2026-03-08T23:50:39.846 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-unexpected-clone/1/keyring auth add osd.1 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:50:40.050 INFO:tasks.workunit.client.0.vm03.stdout:start osd.1 2026-03-08T23:50:40.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.1 2026-03-08T23:50:40.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 1 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/1 --osd-journal=td/osd-unexpected-clone/1/journal --chdir= --run-dir=td/osd-unexpected-clone 
'--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:50:40.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:50:40.051 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq '.flags_set[]' 2026-03-08T23:50:40.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:50:40.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:40.073+0000 7fc4c21ea8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:40.066 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:40.073+0000 7fc4c21ea8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:40.067 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:40.073+0000 7fc4c21ea8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 1 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:40.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:50:40.395 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:41.396 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:50:41.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:41.396 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:41.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:50:41.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:41.396 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:50:41.513 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:41.521+0000 7fc4c21ea8c0 -1 Falling back to public interface 2026-03-08T23:50:41.562 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:42.477 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:42.485+0000 7fc4c21ea8c0 -1 osd.1 0 log_to_monitors true 2026-03-08T23:50:42.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:42.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:42.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:50:42.563 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:50:42.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:42.563 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:50:42.769 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:43.770 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:50:43.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:43.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:43.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:50:43.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:43.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:50:43.947 INFO:tasks.workunit.client.0.vm03.stdout:osd.1 up in weight 1 up_from 10 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6810/2355903499,v1:127.0.0.1:6811/2355903499] [v2:127.0.0.1:6812/2355903499,v1:127.0.0.1:6813/2355903499] exists,up 83e2a4d5-c8bb-4d32-9c95-b4dde6522e14 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:46: TEST_recover_unexpected: run_osd 
td/osd-unexpected-clone 2 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:633: run_osd: local dir=td/osd-unexpected-clone 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:634: run_osd: shift 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:635: run_osd: local id=2 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:636: run_osd: shift 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:637: run_osd: local osd_data=td/osd-unexpected-clone/2 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:639: run_osd: local 'ceph_args=--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 ' 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:640: run_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:641: run_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:642: run_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:643: run_osd: ceph_args+=' --osd-data=td/osd-unexpected-clone/2' 2026-03-08T23:50:43.948 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:644: run_osd: ceph_args+=' --osd-journal=td/osd-unexpected-clone/2/journal' 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:645: run_osd: ceph_args+=' --chdir=' 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:646: run_osd: ceph_args+= 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:647: run_osd: ceph_args+=' --run-dir=td/osd-unexpected-clone' 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: get_asok_path 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:50:43.948 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:43.949 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:648: run_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:649: run_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:650: run_osd: ceph_args+=' --debug-ms=1' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:651: run_osd: ceph_args+=' --debug-monc=20' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:652: run_osd: ceph_args+=' --log-file=td/osd-unexpected-clone/$name.log' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:653: run_osd: ceph_args+=' --pid-file=td/osd-unexpected-clone/$name.pid' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:654: run_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:655: run_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:656: run_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:657: run_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:50:43.949 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:658: run_osd: ceph_args+=' ' 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:659: run_osd: ceph_args+= 2026-03-08T23:50:43.949 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:660: run_osd: mkdir -p td/osd-unexpected-clone/2 2026-03-08T23:50:43.950 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: uuidgen 2026-03-08T23:50:43.951 INFO:tasks.workunit.client.0.vm03.stdout:add osd2 5b7f726b-4b65-4cac-9807-0f70247440da 2026-03-08T23:50:43.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:662: run_osd: local uuid=5b7f726b-4b65-4cac-9807-0f70247440da 2026-03-08T23:50:43.951 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:663: run_osd: echo 'add osd2 5b7f726b-4b65-4cac-9807-0f70247440da' 2026-03-08T23:50:43.951 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: ceph-authtool --gen-print-key 2026-03-08T23:50:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:664: run_osd: OSD_SECRET=AQBTC65p7wsWOhAAoNBRzaOPp4fmM5yBjediTg== 2026-03-08T23:50:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:665: run_osd: echo '{"cephx_secret": "AQBTC65p7wsWOhAAoNBRzaOPp4fmM5yBjediTg=="}' 2026-03-08T23:50:43.963 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:666: run_osd: ceph osd new 5b7f726b-4b65-4cac-9807-0f70247440da -i td/osd-unexpected-clone/2/new.json 2026-03-08T23:50:44.129 
INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:50:44.144 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:667: run_osd: rm td/osd-unexpected-clone/2/new.json 2026-03-08T23:50:44.145 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:668: run_osd: ceph-osd -i 2 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/2 --osd-journal=td/osd-unexpected-clone/2/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops --mkfs --key AQBTC65p7wsWOhAAoNBRzaOPp4fmM5yBjediTg== --osd-uuid 5b7f726b-4b65-4cac-9807-0f70247440da 2026-03-08T23:50:44.160 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:44.169+0000 7fa8947668c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:44.162 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:44.169+0000 7fa8947668c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:44.163 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:44.169+0000 7fa8947668c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:50:44.163 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:44.169+0000 7fa8947668c0 -1 bdev(0x5640bcc59c00 td/osd-unexpected-clone/2/block) open stat got: (1) Operation not permitted 2026-03-08T23:50:44.163 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:44.169+0000 7fa8947668c0 -1 bluestore(td/osd-unexpected-clone/2) _read_fsid unparsable uuid 2026-03-08T23:50:46.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:670: run_osd: local key_fn=td/osd-unexpected-clone/2/keyring 2026-03-08T23:50:46.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:671: run_osd: cat 2026-03-08T23:50:46.445 INFO:tasks.workunit.client.0.vm03.stdout:adding osd2 key to auth repository 2026-03-08T23:50:46.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:675: run_osd: echo adding osd2 key to auth repository 2026-03-08T23:50:46.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:676: run_osd: ceph -i td/osd-unexpected-clone/2/keyring auth add osd.2 osd 'allow *' mon 'allow profile osd' mgr 'allow profile osd' 2026-03-08T23:50:46.648 INFO:tasks.workunit.client.0.vm03.stdout:start osd.2 2026-03-08T23:50:46.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:677: run_osd: echo start osd.2 2026-03-08T23:50:46.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: ceph osd dump --format=json 2026-03-08T23:50:46.648 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: grep -q '"noup"' 2026-03-08T23:50:46.649 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:681: run_osd: jq 
'.flags_set[]' 2026-03-08T23:50:46.652 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:678: run_osd: ceph-osd -i 2 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/2 --osd-journal=td/osd-unexpected-clone/2/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 --debug-ms=1 --debug-monc=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:50:46.680 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:46.681+0000 7ff2066198c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:46.684 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:46.693+0000 7ff2066198c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:46.686 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:46.693+0000 7ff2066198c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:684: run_osd: wait_for_osd up 2 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stdout:0 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:46.820 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:50:46.988 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:47.137 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:47.145+0000 7ff2066198c0 -1 Falling back to public interface 2026-03-08T23:50:47.989 INFO:tasks.workunit.client.0.vm03.stdout:1 2026-03-08T23:50:47.989 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:47.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:47.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:50:47.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:47.989 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:50:48.151 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:48.592 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:48.597+0000 7ff2066198c0 -1 osd.2 0 log_to_monitors true 2026-03-08T23:50:49.152 INFO:tasks.workunit.client.0.vm03.stdout:2 2026-03-08T23:50:49.152 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:49.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:49.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:50:49.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:49.153 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:50:49.328 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:49.842 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:49.849+0000 7ff201dd2640 -1 osd.2 0 waiting for initial osdmap 2026-03-08T23:50:50.329 INFO:tasks.workunit.client.0.vm03.stdout:3 2026-03-08T23:50:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:50:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:50.329 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:50:50.495 INFO:tasks.workunit.client.0.vm03.stdout:osd.2 up in weight 1 up_from 15 up_thru 0 down_at 0 last_clean_interval [0,0) [v2:127.0.0.1:6818/748278127,v1:127.0.0.1:6819/748278127] [v2:127.0.0.1:6820/748278127,v1:127.0.0.1:6821/748278127] exists,up 5b7f726b-4b65-4cac-9807-0f70247440da 2026-03-08T23:50:50.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:50:50.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:50:50.496 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:50:50.496 
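The xtrace above repeatedly walks the `wait_for_osd` helper (ceph-helpers.sh:978-991): up to 300 one-second polls of `ceph osd dump`, breaking as soon as `osd.<id> up` appears. A minimal runnable sketch of that loop; the `ceph` stub and its counter file are illustrative stand-ins for the real CLI, which is not available here:

```shell
# Sketch of wait_for_osd (ceph-helpers.sh:978-991). The `ceph` function is
# a stub that only reports the OSD up on the third poll, so both the retry
# and the break path are exercised; a counter file carries state across the
# subshell the pipeline creates.
counter=$(mktemp)
echo 0 > "$counter"
ceph() {   # stand-in for `ceph osd dump`
    local n=$(( $(cat "$counter") + 1 ))
    echo "$n" > "$counter"
    if [ "$n" -ge 3 ]; then
        echo "osd.1 up   in  weight 1 ..."
    else
        echo "osd.1 down out weight 0 ..."
    fi
}

wait_for_osd() {
    local state=$1 id=$2 status=1 i
    for ((i = 0; i < 300; i++)); do
        echo $i                              # progress marker, as in the trace
        if ceph | grep -q "osd.$id $state"; then
            status=0
            break
        fi
        sleep 0.1                            # the real helper sleeps 1s
    done
    return $status
}

wait_for_osd up 1 && echo "osd.1 is up"
```

In the trace the grep succeeds on the fourth attempt (`echo 3`), at which point `status=0`, `break`, and `return 0` fire in sequence, exactly as in the sketch.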
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:48: TEST_recover_unexpected: ceph osd pool create foo 1 2026-03-08T23:50:50.884 INFO:tasks.workunit.client.0.vm03.stderr:pool 'foo' created 2026-03-08T23:50:50.897 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:49: TEST_recover_unexpected: rados -p foo put foo /etc/passwd 2026-03-08T23:50:52.172 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:50: TEST_recover_unexpected: rados -p foo mksnap snap 2026-03-08T23:50:52.221 INFO:tasks.workunit.client.0.vm03.stdout:created pool foo snap snap 2026-03-08T23:50:52.223 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:51: TEST_recover_unexpected: rados -p foo put foo /etc/group 2026-03-08T23:50:52.247 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:53: TEST_recover_unexpected: wait_for_clean 2026-03-08T23:50:52.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:50:52.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:50:52.248 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:50:52.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:50:52.248 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:50:52.248 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:50:52.248 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:50:52.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:50:52.249 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:50:52.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:50:52.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:50:52.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:50:52.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:50:52.304 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:50:52.304 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:50:52.469 
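The `get_timeout_delays 90 .1` call above expands to the delay list `0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5`: a doubling backoff, capped at 15s per step, whose terms sum to the 90s timeout. A sketch of that computation, reconstructed from the traced output rather than copied from the helper's actual source, using awk for the floating-point arithmetic bash lacks:

```shell
# Reconstructed sketch of get_timeout_delays: emit a doubling backoff
# starting at `step`, capping each delay at 15s, and trimming the final
# delay so the series sums exactly to `timeout`.
get_timeout_delays() {
    local timeout=$1 step=$2
    awk -v timeout="$timeout" -v step="$step" 'BEGIN {
        total = 0
        sep = ""
        while (total < timeout) {
            d = (step > 15) ? 15 : step          # cap the individual delay
            if (total + d >= timeout) {          # last term: remainder only
                printf "%s%g\n", sep, timeout - total
                break
            }
            printf "%s%g", sep, d
            total += d
            step *= 2
            sep = " "
        }
    }'
}

get_timeout_delays 90 .1
# -> 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5
```

This shape front-loads cheap polls while bounding total wait time, which is why `wait_for_clean` below gives up after the delays array is exhausted rather than after a fixed iteration count.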
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:50:52.469 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:52.469 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:50:52.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:50:52.469 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:52.469 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:50:52.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836485 2026-03-08T23:50:52.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836485 2026-03-08T23:50:52.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485' 2026-03-08T23:50:52.551 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:52.551 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:50:52.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=42949672963 2026-03-08T23:50:52.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 
42949672963 2026-03-08T23:50:52.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963' 2026-03-08T23:50:52.629 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:52.629 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:50:52.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509442 2026-03-08T23:50:52.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509442 2026-03-08T23:50:52.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836485 1-42949672963 2-64424509442' 2026-03-08T23:50:52.703 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:52.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836485 2026-03-08T23:50:52.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:52.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:50:52.705 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836485 2026-03-08T23:50:52.705 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:52.706 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.0 seq 21474836485 2026-03-08T23:50:52.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836485 2026-03-08T23:50:52.706 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836485' 2026-03-08T23:50:52.706 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:52.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836484 -lt 21474836485 2026-03-08T23:50:52.867 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:50:53.868 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:50:53.868 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:50:54.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836485 -lt 21474836485 2026-03-08T23:50:54.035 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:54.035 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 
1-42949672963 2026-03-08T23:50:54.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:54.037 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:50:54.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-42949672963 2026-03-08T23:50:54.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:54.038 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.1 seq 42949672963 2026-03-08T23:50:54.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=42949672963 2026-03-08T23:50:54.039 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 42949672963' 2026-03-08T23:50:54.039 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:50:54.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 42949672963 -lt 42949672963 2026-03-08T23:50:54.206 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:50:54.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509442 2026-03-08T23:50:54.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: 
flush_pg_stats: cut -d - -f 1 2026-03-08T23:50:54.207 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:50:54.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509442 2026-03-08T23:50:54.208 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:50:54.209 INFO:tasks.workunit.client.0.vm03.stdout:waiting osd.2 seq 64424509442 2026-03-08T23:50:54.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509442 2026-03-08T23:50:54.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509442' 2026-03-08T23:50:54.209 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:50:54.374 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509442 -lt 64424509442 2026-03-08T23:50:54.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:50:54.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:54.374 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:54.566 
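The `flush_pg_stats` trace above has two phases: tell every OSD to flush its PG stats and record the sequence number each returns (`seqs=' 0-21474836485 1-42949672963 2-64424509442'`), then poll `ceph osd last-stat-seq` per OSD until the monitor has caught up to that seq. A runnable sketch of that handshake; the `ceph` stub and its canned sequence numbers (taken from the trace) stand in for a live cluster:

```shell
# Sketch of flush_pg_stats (ceph-helpers.sh:2260-2279): flush each OSD's
# PG stats, then wait until `osd last-stat-seq` reaches the returned seq.
# `ceph` is stubbed; the last-stat-seq reply pretends the mon has caught up.
ceph() {
    case "$1 $2" in
        "osd ls")            printf '0\n1\n2\n' ;;
        "tell osd.0")        echo 21474836485 ;;
        "tell osd.1")        echo 42949672963 ;;
        "tell osd.2")        echo 64424509442 ;;
        "osd last-stat-seq") echo 99999999999 ;;
    esac
}

flush_pg_stats() {
    local ids seqs osd seq s
    ids=$(ceph osd ls)
    for osd in $ids; do
        seq=$(ceph tell osd.$osd flush_pg_stats)
        test -z "$seq" && continue
        seqs="$seqs $osd-$seq"                 # remember "<osd>-<seq>" pairs
    done
    for s in $seqs; do
        osd=$(echo $s | cut -d - -f 1)
        seq=$(echo $s | cut -d - -f 2)
        echo "waiting osd.$osd seq $seq"
        while test $(ceph osd last-stat-seq $osd) -lt $seq; do
            sleep 1                            # mon not caught up yet; retry
        done
    done
}

flush_pg_stats
```

In the real trace, osd.0's first `last-stat-seq` probe returns 21474836484 (one short), so the helper sleeps once before the second probe satisfies the `-lt` test — the same retry path the stub here skips.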
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:50:54.566 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:50:54.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:50:54.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:50:54.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:50:54.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:50:54.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:50:54.566 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:50:54.733 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:50:54.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:50:54.733 
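The `get_num_active_clean` helper traced above builds a jq filter that counts PG states containing both "active" and "clean" but not "stale". The same counting rule can be sketched in pure shell (`count_active_clean` is a hypothetical stand-in for the jq pipeline, shown here to make the selection logic explicit):

```shell
# Pure-shell sketch of the counting rule behind get_num_active_clean:
# a PG counts only if its state string has both "active" and "clean"
# and does not contain "stale".
count_active_clean() {
    local n=0 state
    for state in "$@"; do
        case $state in
            *stale*) ;;                              # stale PGs never count
            *active*clean*|*clean*active*) n=$((n+1)) ;;
        esac
    done
    echo "$n"
}

count_active_clean active+clean stale+active+clean active+recovering
```

`wait_for_clean` then compares this count against `get_num_pgs` (the `test 1 = 1` line) and breaks out of its retry loop once they match.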
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:50:54.733 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:50:54.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:50:54.929 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:50:54.930 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:50:54.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:55: TEST_recover_unexpected: get_primary foo foo 2026-03-08T23:50:54.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1196: get_primary: local poolname=foo 2026-03-08T23:50:54.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1197: get_primary: local objectname=foo 2026-03-08T23:50:54.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1199: get_primary: ceph --format json osd map foo foo 2026-03-08T23:50:54.930 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1200: get_primary: jq .acting_primary 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:55: TEST_recover_unexpected: local osd=1 2026-03-08T23:50:55.101 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:57: TEST_recover_unexpected: objectstore_tool td/osd-unexpected-clone 1 --op list foo 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:57: TEST_recover_unexpected: grep snapid.:1 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-unexpected-clone 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-unexpected-clone 1 --op list foo 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-unexpected-clone 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: 
_objectstore_tool_nowait: shift 2026-03-08T23:50:55.101 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-unexpected-clone TERM osd.1 2026-03-08T23:50:55.102 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:50:55.102 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:50:55.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:50:55.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:50:55.102 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:50:55.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:50:55.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-unexpected-clone 1 --op list foo 2026-03-08T23:50:55.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-unexpected-clone 2026-03-08T23:50:55.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:50:55.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: 
_objectstore_tool_nodown: local id=1 2026-03-08T23:50:55.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:50:55.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-unexpected-clone/1 2026-03-08T23:50:55.207 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-unexpected-clone/1 --op list foo 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-unexpected-clone 1 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-unexpected-clone 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-unexpected-clone/1 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 ' 
2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-unexpected-clone/1' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-unexpected-clone/1/journal' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-unexpected-clone' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:50:55.824 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-unexpected-clone/$name.log' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-unexpected-clone/$name.pid' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:50:55.824 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:50:55.824 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-unexpected-clone/1 2026-03-08T23:50:55.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:50:55.825 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 2026-03-08T23:50:55.825 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/1 --osd-journal=td/osd-unexpected-clone/1/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-unexpected-clone/$name.log' 
'--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:50:55.825 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-unexpected-clone/1/whoami 2026-03-08T23:50:55.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:50:55.826 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:50:55.827 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:50:55.836 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:50:55.840 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:55.845+0000 7fc5d51858c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:55.841 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:55.849+0000 7fc5d51858c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:50:55.843 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:55.849+0000 7fc5d51858c0 -1 WARNING: all dangerous and experimental features are enabled. 
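Note how the `ceph-osd` invocation above single-quotes arguments such as `--log-file=td/osd-unexpected-clone/$name.log`: Ceph expands metavariables like `$name` itself, so the literal string must survive the shell. A small sketch of the difference (the path is taken from the trace; the `name=osd.1` assignment is illustrative):

```shell
# Why activate_osd single-quotes some arguments: the shell must NOT
# expand $name, because the ceph daemon substitutes it at runtime.
name=osd.1   # what the shell would substitute if allowed to

printf '%s\n' "--log-file=td/osd-unexpected-clone/$name.log"   # shell expands now
printf '%s\n' '--log-file=td/osd-unexpected-clone/$name.log'   # literal; ceph expands later
```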
2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:56.006 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:50:56.171 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:57.037 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:57.045+0000 7fc5d51858c0 -1 Falling back to public interface 2026-03-08T23:50:57.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: 
wait_for_osd: (( i++ )) 2026-03-08T23:50:57.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:57.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:50:57.172 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:57.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:57.172 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:50:57.332 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:58.042 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:50:58.049+0000 7fc5d51858c0 -1 osd.1 19 log_to_monitors true 2026-03-08T23:50:58.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:58.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:58.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:50:58.334 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:50:58.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:58.334 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:50:58.532 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:50:59.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:50:59.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:50:59.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:50:59.533 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:50:59.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:50:59.533 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:50:59.710 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 23 up_thru 23 down_at 20 last_clean_interval [10,19) [v2:127.0.0.1:6810/807832744,v1:127.0.0.1:6811/807832744] [v2:127.0.0.1:6812/807832744,v1:127.0.0.1:6813/807832744] exists,up 83e2a4d5-c8bb-4d32-9c95-b4dde6522e14 2026-03-08T23:50:59.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:50:59.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:50:59.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:50:59.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 
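The `wait_for_osd` loop traced above is a bounded poll: retry a predicate once per second, up to a limit (here 300), and fail if it never holds. The pattern can be sketched generically (`try_until` is a hypothetical generalization, not a ceph-helpers.sh function):

```shell
# Sketch of the wait_for_osd polling pattern: run the predicate up to
# $limit times, sleeping 1s between attempts, and report success as
# soon as it holds.
try_until() {
    local limit=$1; shift
    local i
    for ((i = 0; i < limit; i++)); do
        "$@" && return 0      # predicate held: success
        sleep 1               # back off before retrying
    done
    return 1                  # gave up after $limit attempts
}

try_until 5 true && echo up
```

In the trace the predicate is `ceph osd dump | grep 'osd.1 up'`, which first succeeds on iteration 3, after which `wait_for_osd` breaks and returns 0.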
2026-03-08T23:50:59.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:50:59.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:50:59.710 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:50:59.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:50:59.711 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:50:59.711 INFO:tasks.workunit.client.0.vm03.stderr:////home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:50:59.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:50:59.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:50:59.711 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:50:59.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:50:59.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 
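The `delays` array produced by `get_timeout_delays 90 .1` above follows a capped exponential backoff: double the delay each step, cap it (here at 15), and top up the final entry so the series sums exactly to the timeout. A sketch of that schedule, using awk for the floating-point math (the cap value of 15 is inferred from the traced output, not read from the helper's source):

```shell
# Sketch of the backoff schedule behind get_timeout_delays 90 .1:
# doubling delays, capped, with a final entry that makes the total
# equal the timeout.
awk -v timeout=90 -v first=0.1 -v cap=15 'BEGIN {
    total = 0; d = first
    while (total + d < timeout) {
        printf "%g ", d; total += d
        d *= 2; if (d > cap) d = cap
    }
    printf "%g\n", timeout - total   # final delay tops the sum up to timeout
}'
```

This reproduces the traced array: 0.1 0.2 0.4 0.8 1.6 3.2 6.4 12.8 15 15 15 15 4.5, which sums to 90 seconds.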
2026-03-08T23:50:59.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:50:59.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:50:59.776 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:50:59.776 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:50:59.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:50:59.940 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:50:59.940 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:50:59.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:50:59.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:50:59.940 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:51:00.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836487 2026-03-08T23:51:00.036 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836487 2026-03-08T23:51:00.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: 
seqs=' 0-21474836487' 2026-03-08T23:51:00.037 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:51:00.037 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:51:00.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=98784247810 2026-03-08T23:51:00.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 98784247810 2026-03-08T23:51:00.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-98784247810' 2026-03-08T23:51:00.116 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:51:00.116 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:51:00.194 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509444 2026-03-08T23:51:00.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509444 2026-03-08T23:51:00.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836487 1-98784247810 2-64424509444' 2026-03-08T23:51:00.195 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 
2026-03-08T23:51:00.195 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836487 2026-03-08T23:51:00.195 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:00.196 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:51:00.196 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836487 2026-03-08T23:51:00.196 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:00.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836487 2026-03-08T23:51:00.197 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836487' 2026-03-08T23:51:00.197 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836487 2026-03-08T23:51:00.197 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:51:00.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836486 -lt 21474836487 2026-03-08T23:51:00.361 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:51:01.362 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 
300 -eq 0 ']' 2026-03-08T23:51:01.362 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:51:01.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836488 -lt 21474836487 2026-03-08T23:51:01.521 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:01.521 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-98784247810 2026-03-08T23:51:01.521 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:01.522 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:51:01.522 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-98784247810 2026-03-08T23:51:01.522 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:01.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=98784247810 2026-03-08T23:51:01.523 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 98784247810' 2026-03-08T23:51:01.523 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 98784247810 2026-03-08T23:51:01.523 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:51:01.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 98784247810 -lt 98784247810 2026-03-08T23:51:01.684 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:01.684 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509444 2026-03-08T23:51:01.684 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:01.685 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:51:01.685 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509444 2026-03-08T23:51:01.686 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:01.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509444 2026-03-08T23:51:01.686 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509444' 2026-03-08T23:51:01.686 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509444 2026-03-08T23:51:01.686 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 2 2026-03-08T23:51:01.846 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509445 -lt 64424509444 2026-03-08T23:51:01.846 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:51:01.847 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:51:01.847 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:51:02.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:51:02.044 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:51:02.044 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:51:02.044 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:51:02.044 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:51:02.044 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:51:02.044 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: 
get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:51:02.044 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:51:02.216 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:51:02.216 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:51:02.216 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:51:02.216 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:51:02.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:51:02.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:51:02.412 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:51:02.412 INFO:tasks.workunit.client.0.vm03.stdout:JSON is ["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}] 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:57: TEST_recover_unexpected: JSON='["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' 
2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:58: TEST_recover_unexpected: echo 'JSON is ["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:59: TEST_recover_unexpected: rm -f td/osd-unexpected-clone/_ td/osd-unexpected-clone/data 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:60: TEST_recover_unexpected: objectstore_tool td/osd-unexpected-clone 1 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' get-attr _ 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-unexpected-clone 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=1 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-unexpected-clone 1 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' get-attr _ 2026-03-08T23:51:02.413 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: 
_objectstore_tool_nowait: local dir=td/osd-unexpected-clone 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=1 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-unexpected-clone TERM osd.1 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:51:02.414 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:51:02.519 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:51:02.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-unexpected-clone 1 
'["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' get-attr _ 2026-03-08T23:51:02.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-unexpected-clone 2026-03-08T23:51:02.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:51:02.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=1 2026-03-08T23:51:02.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:51:02.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-unexpected-clone/1 2026-03-08T23:51:02.520 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-unexpected-clone/1 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' get-attr _ 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-unexpected-clone 1 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-unexpected-clone 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:51:03.140 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=1 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-unexpected-clone/1 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 ' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-unexpected-clone/1' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-unexpected-clone/1/journal' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:51:03.140 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-unexpected-clone' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:51:03.140 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:51:03.141 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-unexpected-clone/$name.log' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-unexpected-clone/$name.pid' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:51:03.141 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-unexpected-clone/1 2026-03-08T23:51:03.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.1 2026-03-08T23:51:03.142 INFO:tasks.workunit.client.0.vm03.stderr:start osd.1 
2026-03-08T23:51:03.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 1 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/1 --osd-journal=td/osd-unexpected-clone/1/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:51:03.142 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-unexpected-clone/1/whoami 2026-03-08T23:51:03.142 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 1 = 1 ']' 2026-03-08T23:51:03.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:51:03.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:51:03.147 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:51:03.156 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:03.161+0000 7f7ace3ae8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:51:03.160 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:03.169+0000 7f7ace3ae8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:51:03.161 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:03.169+0000 7f7ace3ae8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:51:03.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 1 2026-03-08T23:51:03.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:51:03.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=1 2026-03-08T23:51:03.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:51:03.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:51:03.322 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:03.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:51:03.323 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:51:03.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:03.323 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:51:03.514 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:04.125 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:04.133+0000 7f7ace3ae8c0 -1 Falling back to public interface 2026-03-08T23:51:04.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:51:04.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:04.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:51:04.515 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:51:04.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:04.515 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:51:04.676 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:05.408 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:05.417+0000 7f7ace3ae8c0 -1 osd.1 24 log_to_monitors true 2026-03-08T23:51:05.677 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:51:05.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:05.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:51:05.678 
INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:51:05.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:05.678 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:51:05.843 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:06.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:51:06.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:06.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:51:06.844 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:51:06.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:06.844 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.1 up' 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:osd.1 up in weight 1 up_from 28 up_thru 28 down_at 25 last_clean_interval [23,24) [v2:127.0.0.1:6810/3123731435,v1:127.0.0.1:6811/3123731435] [v2:127.0.0.1:6812/3123731435,v1:127.0.0.1:6813/3123731435] exists,up 83e2a4d5-c8bb-4d32-9c95-b4dde6522e14 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:51:07.008 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:51:07.008 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:51:07.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:51:07.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:51:07.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:51:07.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:51:07.066 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:51:07.066 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:51:07.221 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:51:07.222 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:51:07.222 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:51:07.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:51:07.222 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:51:07.222 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:51:07.296 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836490 2026-03-08T23:51:07.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836490 2026-03-08T23:51:07.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490' 2026-03-08T23:51:07.296 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:51:07.296 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:51:07.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084290 2026-03-08T23:51:07.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084290 2026-03-08T23:51:07.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-120259084290' 2026-03-08T23:51:07.370 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:51:07.370 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:51:07.444 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=64424509447 2026-03-08T23:51:07.445 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 64424509447 2026-03-08T23:51:07.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836490 1-120259084290 2-64424509447' 2026-03-08T23:51:07.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:07.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836490 2026-03-08T23:51:07.445 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:07.445 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:51:07.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836490 2026-03-08T23:51:07.446 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:07.446 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836490 2026-03-08T23:51:07.447 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836490' 2026-03-08T23:51:07.447 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836490 2026-03-08T23:51:07.447 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:51:07.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836490 -lt 21474836490 2026-03-08T23:51:07.609 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:07.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-120259084290 2026-03-08T23:51:07.609 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:07.610 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:51:07.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-120259084290 2026-03-08T23:51:07.611 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:07.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084290 2026-03-08T23:51:07.612 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 120259084290' 2026-03-08T23:51:07.612 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 120259084290 2026-03-08T23:51:07.612 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:51:07.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 120259084290 -lt 120259084290 2026-03-08T23:51:07.778 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:07.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-64424509447 2026-03-08T23:51:07.778 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:07.779 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:51:07.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-64424509447 2026-03-08T23:51:07.779 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:07.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=64424509447 2026-03-08T23:51:07.780 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 64424509447' 2026-03-08T23:51:07.780 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 64424509447 2026-03-08T23:51:07.780 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:51:07.940 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 64424509447 -lt 64424509447 2026-03-08T23:51:07.940 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:51:07.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:51:07.940 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:51:08.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:51:08.143 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:51:08.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:51:08.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:51:08.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:51:08.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:51:08.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:51:08.143 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | 
select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:51:08.316 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:51:08.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:51:08.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:51:08.316 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:61: TEST_recover_unexpected: objectstore_tool td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' get-bytes td/osd-unexpected-clone/data 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-unexpected-clone 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 
2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' get-bytes td/osd-unexpected-clone/data 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-unexpected-clone 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-unexpected-clone TERM osd.2 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:51:08.510 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:51:08.510 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:51:08.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:51:08.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' get-bytes td/osd-unexpected-clone/data 2026-03-08T23:51:08.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-unexpected-clone 2026-03-08T23:51:08.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:51:08.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T23:51:08.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:51:08.614 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-unexpected-clone/2 2026-03-08T23:51:08.614 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-unexpected-clone/2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' get-bytes td/osd-unexpected-clone/data 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-unexpected-clone 2 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-unexpected-clone 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-unexpected-clone/2 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 ' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: 
activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-unexpected-clone/2' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-unexpected-clone/2/journal' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-unexpected-clone' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:51:09.224 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:51:09.224 
INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-unexpected-clone/$name.log' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-unexpected-clone/$name.pid' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:51:09.225 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:51:09.225 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-unexpected-clone/2 2026-03-08T23:51:09.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:51:09.226 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2 2026-03-08T23:51:09.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/2 --osd-journal=td/osd-unexpected-clone/2/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:51:09.226 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-unexpected-clone/2/whoami 2026-03-08T23:51:09.227 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:51:09.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:51:09.228 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:51:09.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:51:09.249 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:09.249+0000 7fbe0df5a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:51:09.250 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:09.257+0000 7fbe0df5a8c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:51:09.252 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:09.257+0000 7fbe0df5a8c0 -1 WARNING: all dangerous and experimental features are enabled. 
2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:09.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:51:09.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:10.445 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:10.453+0000 7fbe0df5a8c0 -1 Falling back to public interface 2026-03-08T23:51:10.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ 
)) 2026-03-08T23:51:10.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:10.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:51:10.556 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:51:10.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:10.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:51:10.722 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:11.451 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:11.457+0000 7fbe0df5a8c0 -1 osd.2 29 log_to_monitors true 2026-03-08T23:51:11.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:51:11.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:11.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:51:11.723 INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:51:11.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:11.723 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:51:11.893 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:12.392 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:12.397+0000 7fbe04f0a640 -1 osd.2 29 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:51:12.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:51:12.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:12.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:51:12.895 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:51:12.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:12.895 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:51:13.056 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 33 up_thru 0 down_at 30 last_clean_interval [15,29) [v2:127.0.0.1:6818/1025308889,v1:127.0.0.1:6819/1025308889] [v2:127.0.0.1:6820/1025308889,v1:127.0.0.1:6821/1025308889] exists,up 5b7f726b-4b65-4cac-9807-0f70247440da 2026-03-08T23:51:13.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:51:13.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:51:13.057 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:51:13.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:51:13.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:51:13.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:51:13.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:51:13.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:51:13.057 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:51:13.057 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:51:13.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:51:13.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true 2026-03-08T23:51:13.058 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:51:13.114 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:51:13.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:51:13.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:51:13.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:51:13.114 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:51:13.114 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:51:13.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:51:13.274 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:51:13.274 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:51:13.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:51:13.274 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:51:13.274 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:51:13.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836492
2026-03-08T23:51:13.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836492
2026-03-08T23:51:13.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492'
2026-03-08T23:51:13.348 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:51:13.348 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:51:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084292
2026-03-08T23:51:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084292
2026-03-08T23:51:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-120259084292'
2026-03-08T23:51:13.423 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:51:13.423 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:51:13.501 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=141733920770
2026-03-08T23:51:13.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 141733920770
2026-03-08T23:51:13.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836492 1-120259084292 2-141733920770'
2026-03-08T23:51:13.502 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:51:13.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836492
2026-03-08T23:51:13.502 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:51:13.503 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0
2026-03-08T23:51:13.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836492
2026-03-08T23:51:13.503 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:51:13.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836492
2026-03-08T23:51:13.505 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836492'
2026-03-08T23:51:13.505 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836492
2026-03-08T23:51:13.505 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0
2026-03-08T23:51:13.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836492 -lt 21474836492
2026-03-08T23:51:13.679 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:51:13.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-120259084292
2026-03-08T23:51:13.679 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:51:13.680 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1
2026-03-08T23:51:13.680 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-120259084292
2026-03-08T23:51:13.681 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:51:13.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084292
2026-03-08T23:51:13.682 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 120259084292'
2026-03-08T23:51:13.682 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 120259084292
2026-03-08T23:51:13.682 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1
2026-03-08T23:51:13.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084292 -lt 120259084292
2026-03-08T23:51:13.849 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs
2026-03-08T23:51:13.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-141733920770
2026-03-08T23:51:13.849 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1
2026-03-08T23:51:13.850 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2
2026-03-08T23:51:13.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-141733920770
2026-03-08T23:51:13.850 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2
2026-03-08T23:51:13.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=141733920770
2026-03-08T23:51:13.851 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 141733920770'
2026-03-08T23:51:13.851 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 141733920770
2026-03-08T23:51:13.851 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:51:14.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 141733920770
2026-03-08T23:51:14.016 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:51:15.017 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']'
2026-03-08T23:51:15.017 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:51:15.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 1 -lt 141733920770
2026-03-08T23:51:15.178 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1
2026-03-08T23:51:16.179 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']'
2026-03-08T23:51:16.179 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2
2026-03-08T23:51:16.338 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 141733920770 -lt 141733920770
2026-03-08T23:51:16.338 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs
2026-03-08T23:51:16.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:51:16.339 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:51:16.530 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0
2026-03-08T23:51:16.531 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true
2026-03-08T23:51:16.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean
2026-03-08T23:51:16.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression
2026-03-08T23:51:16.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | '
2026-03-08T23:51:16.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)'
2026-03-08T23:51:16.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs
2026-03-08T23:51:16.531 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length'
2026-03-08T23:51:16.687 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1
2026-03-08T23:51:16.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs
2026-03-08T23:51:16.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status
2026-03-08T23:51:16.687 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs
2026-03-08T23:51:16.884 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1
2026-03-08T23:51:16.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break
2026-03-08T23:51:16.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0
2026-03-08T23:51:16.885 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:63: TEST_recover_unexpected: rados -p foo rmsnap snap
2026-03-08T23:51:16.953 INFO:tasks.workunit.client.0.vm03.stdout:removed pool foo snap snap
2026-03-08T23:51:16.955 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:65: TEST_recover_unexpected: sleep 5
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:67: TEST_recover_unexpected: objectstore_tool td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes td/osd-unexpected-clone/data
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-unexpected-clone
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes td/osd-unexpected-clone/data
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-unexpected-clone
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-unexpected-clone TERM osd.2
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true
2026-03-08T23:51:21.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true
2026-03-08T23:51:21.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace
2026-03-08T23:51:22.061 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0
2026-03-08T23:51:22.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes td/osd-unexpected-clone/data
2026-03-08T23:51:22.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-unexpected-clone
2026-03-08T23:51:22.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift
2026-03-08T23:51:22.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2
2026-03-08T23:51:22.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift
2026-03-08T23:51:22.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-unexpected-clone/2
2026-03-08T23:51:22.062 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-unexpected-clone/2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' set-bytes td/osd-unexpected-clone/data
2026-03-08T23:51:23.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-unexpected-clone 2
2026-03-08T23:51:23.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-unexpected-clone
2026-03-08T23:51:23.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift
2026-03-08T23:51:23.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2
2026-03-08T23:51:23.251 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-unexpected-clone/2
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 '
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-unexpected-clone/2'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-unexpected-clone/2/journal'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir='
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+=
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-unexpected-clone'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name=
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-unexpected-clone/$name.log'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-unexpected-clone/$name.pid'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops'
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' '
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+=
2026-03-08T23:51:23.252 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-unexpected-clone/2
2026-03-08T23:51:23.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2
2026-03-08T23:51:23.253 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2
2026-03-08T23:51:23.253 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/2 --osd-journal=td/osd-unexpected-clone/2/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops
2026-03-08T23:51:23.253 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-unexpected-clone/2/whoami
2026-03-08T23:51:23.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']'
2026-03-08T23:51:23.254 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"'
2026-03-08T23:51:23.255 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]'
2026-03-08T23:51:23.264 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json
2026-03-08T23:51:23.268 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:23.273+0000 7f14a70578c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:51:23.268 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:23.277+0000 7f14a70578c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:51:23.270 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:23.277+0000 7f14a70578c0 -1 WARNING: all dangerous and experimental features are enabled.
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 ))
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:0
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:51:23.432 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:51:23.599 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:51:24.473 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:24.481+0000 7f14a70578c0 -1 Falling back to public interface
2026-03-08T23:51:24.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:51:24.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:51:24.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1
2026-03-08T23:51:24.600 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:51:24.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:51:24.600 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:51:24.756 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:51:25.460 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:25.465+0000 7f14a70578c0 -1 osd.2 35 log_to_monitors true
2026-03-08T23:51:25.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:51:25.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:51:25.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2
2026-03-08T23:51:25.757 INFO:tasks.workunit.client.0.vm03.stderr:2
2026-03-08T23:51:25.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:51:25.757 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:51:25.932 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1
2026-03-08T23:51:26.434 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:26.441+0000 7f149e007640 -1 osd.2 35 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-08T23:51:26.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ ))
2026-03-08T23:51:26.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 ))
2026-03-08T23:51:26.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3
2026-03-08T23:51:26.933 INFO:tasks.workunit.client.0.vm03.stderr:3
2026-03-08T23:51:26.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump
2026-03-08T23:51:26.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up'
2026-03-08T23:51:27.094 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 40 up_thru 0 down_at 36 last_clean_interval [33,35) [v2:127.0.0.1:6818/1407014952,v1:127.0.0.1:6819/1407014952] [v2:127.0.0.1:6820/1407014952,v1:127.0.0.1:6821/1407014952] exists,up 5b7f726b-4b65-4cac-9807-0f70247440da
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd=
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: true
2026-03-08T23:51:27.095 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace
2026-03-08T23:51:27.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5')
2026-03-08T23:51:27.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays
2026-03-08T23:51:27.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0
2026-03-08T23:51:27.158 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats
2026-03-08T23:51:27.159 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300
2026-03-08T23:51:27.159 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls
2026-03-08T23:51:27.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0
2026-03-08T23:51:27.318 INFO:tasks.workunit.client.0.vm03.stderr:1
2026-03-08T23:51:27.318 INFO:tasks.workunit.client.0.vm03.stderr:2'
2026-03-08T23:51:27.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs=
2026-03-08T23:51:27.318 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:51:27.318 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats
2026-03-08T23:51:27.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836496
2026-03-08T23:51:27.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836496
2026-03-08T23:51:27.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496'
2026-03-08T23:51:27.394 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:51:27.394 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats
2026-03-08T23:51:27.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084296
2026-03-08T23:51:27.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084296
2026-03-08T23:51:27.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-120259084296'
2026-03-08T23:51:27.471 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids
2026-03-08T23:51:27.471 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats
2026-03-08T23:51:27.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=171798691842
2026-03-08T23:51:27.553
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 171798691842 2026-03-08T23:51:27.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836496 1-120259084296 2-171798691842' 2026-03-08T23:51:27.553 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:27.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836496 2026-03-08T23:51:27.553 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:27.554 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:51:27.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836496 2026-03-08T23:51:27.554 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:27.555 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836496 2026-03-08T23:51:27.556 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836496' 2026-03-08T23:51:27.556 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836496 2026-03-08T23:51:27.556 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd 
last-stat-seq 0 2026-03-08T23:51:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836496 -lt 21474836496 2026-03-08T23:51:27.717 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:27.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-120259084296 2026-03-08T23:51:27.718 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:27.718 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:51:27.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-120259084296 2026-03-08T23:51:27.719 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:27.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084296 2026-03-08T23:51:27.720 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 120259084296' 2026-03-08T23:51:27.720 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 120259084296 2026-03-08T23:51:27.720 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:51:27.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: 
flush_pg_stats: test 120259084295 -lt 120259084296 2026-03-08T23:51:27.890 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:51:28.891 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:51:28.891 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:51:29.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084295 -lt 120259084296 2026-03-08T23:51:29.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:51:30.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 299 -eq 0 ']' 2026-03-08T23:51:30.056 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:51:30.226 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084296 -lt 120259084296 2026-03-08T23:51:30.227 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:30.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-171798691842 2026-03-08T23:51:30.227 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:30.228 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:51:30.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-171798691842 2026-03-08T23:51:30.228 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:30.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=171798691842 2026-03-08T23:51:30.229 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 171798691842' 2026-03-08T23:51:30.229 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 171798691842 2026-03-08T23:51:30.229 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:51:30.400 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 171798691842 -lt 171798691842 2026-03-08T23:51:30.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:51:30.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:51:30.400 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:51:30.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 
2026-03-08T23:51:30.607 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:51:30.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:51:30.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: get_num_active_clean: local expression 2026-03-08T23:51:30.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:51:30.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:51:30.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:51:30.608 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:51:30.771 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:51:30.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:51:30.771 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:51:30.772 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:68: TEST_recover_unexpected: objectstore_tool td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' set-attr _ td/osd-unexpected-clone/_ 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1297: objectstore_tool: local dir=td/osd-unexpected-clone 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1298: objectstore_tool: shift 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1299: objectstore_tool: local id=2 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1300: objectstore_tool: shift 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1302: objectstore_tool: _objectstore_tool_nowait td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' set-attr _ 
td/osd-unexpected-clone/_ 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1269: _objectstore_tool_nowait: local dir=td/osd-unexpected-clone 2026-03-08T23:51:30.969 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1270: _objectstore_tool_nowait: shift 2026-03-08T23:51:30.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1271: _objectstore_tool_nowait: local id=2 2026-03-08T23:51:30.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1272: _objectstore_tool_nowait: shift 2026-03-08T23:51:30.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1274: _objectstore_tool_nowait: kill_daemons td/osd-unexpected-clone TERM osd.2 2026-03-08T23:51:30.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:51:30.970 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:51:30.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:51:30.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:51:30.970 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:51:31.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:51:31.276 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1276: _objectstore_tool_nowait: _objectstore_tool_nodown td/osd-unexpected-clone 2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' set-attr _ td/osd-unexpected-clone/_ 2026-03-08T23:51:31.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1257: _objectstore_tool_nodown: local dir=td/osd-unexpected-clone 2026-03-08T23:51:31.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1258: _objectstore_tool_nodown: shift 2026-03-08T23:51:31.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1259: _objectstore_tool_nodown: local id=2 2026-03-08T23:51:31.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1260: _objectstore_tool_nodown: shift 2026-03-08T23:51:31.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1261: _objectstore_tool_nodown: local osd_data=td/osd-unexpected-clone/2 2026-03-08T23:51:31.276 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1264: _objectstore_tool_nodown: ceph-objectstore-tool --data-path td/osd-unexpected-clone/2 '["1.0",{"oid":"foo","key":"","snapid":1,"hash":2143417350,"max":0,"pool":1,"namespace":"","max":0}]' set-attr _ td/osd-unexpected-clone/_ 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1277: _objectstore_tool_nowait: activate_osd td/osd-unexpected-clone 2 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:846: activate_osd: local dir=td/osd-unexpected-clone 
2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:847: activate_osd: shift 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:848: activate_osd: local id=2 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:849: activate_osd: shift 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:850: activate_osd: local osd_data=td/osd-unexpected-clone/2 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:852: activate_osd: local 'ceph_args=--fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 ' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:853: activate_osd: ceph_args+=' --osd-failsafe-full-ratio=.99' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:854: activate_osd: ceph_args+=' --osd-journal-size=100' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:855: activate_osd: ceph_args+=' --osd-scrub-load-threshold=2000' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:856: activate_osd: ceph_args+=' --osd-data=td/osd-unexpected-clone/2' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:857: activate_osd: ceph_args+=' --osd-journal=td/osd-unexpected-clone/2/journal' 2026-03-08T23:51:32.472 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:858: activate_osd: ceph_args+=' --chdir=' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:859: activate_osd: ceph_args+= 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:860: activate_osd: ceph_args+=' --run-dir=td/osd-unexpected-clone' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: get_asok_path 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:116: get_asok_path: local name= 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:117: get_asok_path: '[' -n '' ']' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: get_asok_dir 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:120: get_asok_path: echo '/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:861: activate_osd: ceph_args+=' --admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' 2026-03-08T23:51:32.472 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:862: activate_osd: ceph_args+=' --debug-osd=20' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:863: activate_osd: ceph_args+=' --log-file=td/osd-unexpected-clone/$name.log' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:864: activate_osd: ceph_args+=' --pid-file=td/osd-unexpected-clone/$name.pid' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:865: activate_osd: ceph_args+=' --osd-max-object-name-len=460' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:866: activate_osd: ceph_args+=' --osd-max-object-namespace-len=64' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:867: activate_osd: ceph_args+=' --enable-experimental-unrecoverable-data-corrupting-features=*' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:868: activate_osd: ceph_args+=' --osd-mclock-profile=high_recovery_ops' 2026-03-08T23:51:32.472 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:869: activate_osd: ceph_args+=' ' 2026-03-08T23:51:32.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:870: activate_osd: ceph_args+= 2026-03-08T23:51:32.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:871: activate_osd: mkdir -p td/osd-unexpected-clone/2 2026-03-08T23:51:32.473 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:873: activate_osd: echo start osd.2 2026-03-08T23:51:32.473 INFO:tasks.workunit.client.0.vm03.stderr:start osd.2 2026-03-08T23:51:32.473 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:874: activate_osd: ceph-osd -i 2 --fsid=e67033dd-cb94-40ec-b150-37e90e38e2d0 --auth-supported=none --mon-host=127.0.0.1:7144 --osd-failsafe-full-ratio=.99 --osd-journal-size=100 --osd-scrub-load-threshold=2000 --osd-data=td/osd-unexpected-clone/2 --osd-journal=td/osd-unexpected-clone/2/journal --chdir= --run-dir=td/osd-unexpected-clone '--admin-socket=/tmp/ceph-asok.601329/$cluster-$name.asok' --debug-osd=20 '--log-file=td/osd-unexpected-clone/$name.log' '--pid-file=td/osd-unexpected-clone/$name.pid' --osd-max-object-name-len=460 --osd-max-object-namespace-len=64 '--enable-experimental-unrecoverable-data-corrupting-features=*' --osd-mclock-profile=high_recovery_ops 2026-03-08T23:51:32.473 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: cat td/osd-unexpected-clone/2/whoami 2026-03-08T23:51:32.474 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:876: activate_osd: '[' 2 = 2 ']' 2026-03-08T23:51:32.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: grep -q '"noup"' 2026-03-08T23:51:32.475 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: jq '.flags_set[]' 2026-03-08T23:51:32.484 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:879: activate_osd: ceph osd dump --format=json 2026-03-08T23:51:32.490 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:32.497+0000 
7f6c75a118c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:51:32.491 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:32.497+0000 7f6c75a118c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:51:32.493 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:32.501+0000 7f6c75a118c0 -1 WARNING: all dangerous and experimental features are enabled. 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:882: activate_osd: wait_for_osd up 2 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:978: wait_for_osd: local state=up 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:979: wait_for_osd: local id=2 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:981: wait_for_osd: status=1 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i=0 )) 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 0 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:0 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:32.658 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:51:32.833 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:33.449 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:33.457+0000 7f6c75a118c0 -1 Falling back to public interface 2026-03-08T23:51:33.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:51:33.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:33.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 1 2026-03-08T23:51:33.835 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:51:33.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:33.835 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:51:34.018 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:34.436 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:34.441+0000 7f6c75a118c0 -1 osd.2 41 log_to_monitors true 2026-03-08T23:51:35.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:51:35.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:35.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 2 2026-03-08T23:51:35.019 
INFO:tasks.workunit.client.0.vm03.stderr:2 2026-03-08T23:51:35.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:35.019 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:51:35.200 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:985: wait_for_osd: sleep 1 2026-03-08T23:51:35.603 INFO:tasks.workunit.client.0.vm03.stderr:2026-03-08T23:51:35.609+0000 7f6c6c9c1640 -1 osd.2 41 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-08T23:51:36.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i++ )) 2026-03-08T23:51:36.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:982: wait_for_osd: (( i < 300 )) 2026-03-08T23:51:36.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:983: wait_for_osd: echo 3 2026-03-08T23:51:36.202 INFO:tasks.workunit.client.0.vm03.stderr:3 2026-03-08T23:51:36.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: ceph osd dump 2026-03-08T23:51:36.202 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:984: wait_for_osd: grep 'osd.2 up' 2026-03-08T23:51:36.375 INFO:tasks.workunit.client.0.vm03.stderr:osd.2 up in weight 1 up_from 45 up_thru 0 down_at 42 last_clean_interval [40,41) [v2:127.0.0.1:6818/3346090298,v1:127.0.0.1:6819/3346090298] [v2:127.0.0.1:6820/3346090298,v1:127.0.0.1:6821/3346090298] exists,up 5b7f726b-4b65-4cac-9807-0f70247440da 2026-03-08T23:51:36.375 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:987: wait_for_osd: status=0 2026-03-08T23:51:36.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:988: wait_for_osd: break 2026-03-08T23:51:36.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:991: wait_for_osd: return 0 2026-03-08T23:51:36.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1303: objectstore_tool: wait_for_clean 2026-03-08T23:51:36.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1656: wait_for_clean: local cmd= 2026-03-08T23:51:36.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1657: wait_for_clean: local num_active_clean=-1 2026-03-08T23:51:36.375 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1658: wait_for_clean: local cur_active_clean 2026-03-08T23:51:36.375 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: get_timeout_delays 90 .1 2026-03-08T23:51:36.376 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: shopt -q -o xtrace 2026-03-08T23:51:36.376 INFO:tasks.workunit.client.0.vm03.stderr:///home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: echo true 2026-03-08T23:51:36.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1602: get_timeout_delays: local trace=true 2026-03-08T23:51:36.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: 
get_timeout_delays: true 2026-03-08T23:51:36.376 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1603: get_timeout_delays: shopt -u -o xtrace 2026-03-08T23:51:36.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: delays=('0.1' '0.2' '0.4' '0.8' '1.6' '3.2' '6.4' '12.8' '15' '15' '15' '15' '4.5') 2026-03-08T23:51:36.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1659: wait_for_clean: local -a delays 2026-03-08T23:51:36.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1660: wait_for_clean: local -i loop=0 2026-03-08T23:51:36.440 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1662: wait_for_clean: flush_pg_stats 2026-03-08T23:51:36.441 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2260: flush_pg_stats: local timeout=300 2026-03-08T23:51:36.441 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ceph osd ls 2026-03-08T23:51:36.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2262: flush_pg_stats: ids='0 2026-03-08T23:51:36.617 INFO:tasks.workunit.client.0.vm03.stderr:1 2026-03-08T23:51:36.617 INFO:tasks.workunit.client.0.vm03.stderr:2' 2026-03-08T23:51:36.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2263: flush_pg_stats: seqs= 2026-03-08T23:51:36.617 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:51:36.617 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.0 flush_pg_stats 2026-03-08T23:51:36.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=21474836499 2026-03-08T23:51:36.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 21474836499 2026-03-08T23:51:36.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836499' 2026-03-08T23:51:36.704 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:51:36.704 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.1 flush_pg_stats 2026-03-08T23:51:36.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=120259084299 2026-03-08T23:51:36.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 120259084299 2026-03-08T23:51:36.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836499 1-120259084299' 2026-03-08T23:51:36.788 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2264: flush_pg_stats: for osd in $ids 2026-03-08T23:51:36.788 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: ceph tell osd.2 flush_pg_stats 2026-03-08T23:51:36.879 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2265: flush_pg_stats: seq=193273528322 2026-03-08T23:51:36.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2266: flush_pg_stats: test -z 193273528322 2026-03-08T23:51:36.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2270: flush_pg_stats: seqs=' 0-21474836499 1-120259084299 2-193273528322' 2026-03-08T23:51:36.879 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:36.879 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 0-21474836499 2026-03-08T23:51:36.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:36.880 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=0 2026-03-08T23:51:36.880 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 0-21474836499 2026-03-08T23:51:36.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:36.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=21474836499 2026-03-08T23:51:36.881 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.0 seq 21474836499' 2026-03-08T23:51:36.881 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.0 seq 21474836499 
2026-03-08T23:51:36.881 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:51:37.055 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836497 -lt 21474836499 2026-03-08T23:51:37.056 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2278: flush_pg_stats: sleep 1 2026-03-08T23:51:38.057 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2279: flush_pg_stats: '[' 300 -eq 0 ']' 2026-03-08T23:51:38.057 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 0 2026-03-08T23:51:38.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 21474836499 -lt 21474836499 2026-03-08T23:51:38.236 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:38.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 1-120259084299 2026-03-08T23:51:38.236 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:38.237 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=1 2026-03-08T23:51:38.237 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 1-120259084299 2026-03-08T23:51:38.237 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 2026-03-08T23:51:38.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=120259084299 2026-03-08T23:51:38.238 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.1 seq 120259084299' 2026-03-08T23:51:38.238 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.1 seq 120259084299 2026-03-08T23:51:38.238 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 1 2026-03-08T23:51:38.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 120259084299 -lt 120259084299 2026-03-08T23:51:38.405 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2273: flush_pg_stats: for s in $seqs 2026-03-08T23:51:38.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: echo 2-193273528322 2026-03-08T23:51:38.405 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: cut -d - -f 1 2026-03-08T23:51:38.406 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2274: flush_pg_stats: osd=2 2026-03-08T23:51:38.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: echo 2-193273528322 2026-03-08T23:51:38.406 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: cut -d - -f 2 
2026-03-08T23:51:38.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2275: flush_pg_stats: seq=193273528322 2026-03-08T23:51:38.407 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2276: flush_pg_stats: echo 'waiting osd.2 seq 193273528322' 2026-03-08T23:51:38.407 INFO:tasks.workunit.client.0.vm03.stderr:waiting osd.2 seq 193273528322 2026-03-08T23:51:38.407 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: ceph osd last-stat-seq 2 2026-03-08T23:51:38.594 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2277: flush_pg_stats: test 193273528322 -lt 193273528322 2026-03-08T23:51:38.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: get_num_pgs 2026-03-08T23:51:38.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:51:38.594 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:51:38.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1663: wait_for_clean: test 1 == 0 2026-03-08T23:51:38.800 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1667: wait_for_clean: true 2026-03-08T23:51:38.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: get_num_active_clean 2026-03-08T23:51:38.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1364: 
get_num_active_clean: local expression 2026-03-08T23:51:38.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1365: get_num_active_clean: expression+='select(contains("active") and contains("clean")) | ' 2026-03-08T23:51:38.800 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1366: get_num_active_clean: expression+='select(contains("stale") | not)' 2026-03-08T23:51:38.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1367: get_num_active_clean: ceph --format json pg dump pgs 2026-03-08T23:51:38.801 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1368: get_num_active_clean: jq '.pg_stats | [.[] | .state | select(contains("active") and contains("clean")) | select(contains("stale") | not)] | length' 2026-03-08T23:51:38.987 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1671: wait_for_clean: cur_active_clean=1 2026-03-08T23:51:38.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: get_num_pgs 2026-03-08T23:51:38.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: ceph --format json status 2026-03-08T23:51:38.987 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1425: get_num_pgs: jq .pgmap.num_pgs 2026-03-08T23:51:39.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: test 1 = 1 2026-03-08T23:51:39.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1672: wait_for_clean: break 2026-03-08T23:51:39.208 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:1687: wait_for_clean: return 0 2026-03-08T23:51:39.208 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:70: TEST_recover_unexpected: sleep 5 2026-03-08T23:51:44.209 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:72: TEST_recover_unexpected: ceph pg repair 1.0 2026-03-08T23:51:44.365 INFO:tasks.workunit.client.0.vm03.stderr:instructing pg 1.0 on osd.1 to repair 2026-03-08T23:51:44.377 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:74: TEST_recover_unexpected: sleep 10 2026-03-08T23:51:54.379 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:76: TEST_recover_unexpected: ceph log last 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.502386+0000 osd.1 (osd.1) 4 : cluster [ERR] 1.0 shard 1 1:602f83fe:::foo:1 : missing 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.502491+0000 osd.1 (osd.1) 5 : cluster [ERR] repair 1.0 1:602f83fe:::foo:1 : is an unexpected clone 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.502551+0000 osd.1 (osd.1) 6 : cluster [ERR] 1.0 repair 1 missing, 0 inconsistent objects 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.502557+0000 osd.1 (osd.1) 7 : cluster [ERR] 1.0 repair 1 missing, 0 inconsistent objects 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.502579+0000 osd.1 (osd.1) 8 : cluster [ERR] 1.0 repair 3 errors, 2 fixed 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.507766+0000 osd.1 (osd.1) 10 : cluster [ERR] 1.0 shard 1 soid 
1:602f83fe:::foo:1 : candidate size 0 info size 2163 mismatch 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.507796+0000 osd.1 (osd.1) 11 : cluster [ERR] 1.0 shard 0 soid 1:602f83fe:::foo:1 : data_digest 0x6c50065a != data_digest 0x6787e539 from auth oi 1:602f83fe:::foo:1(19'2 client.4162.0:1 dirty|data_digest s 2163 uv 1 dd 6787e539 alloc_hint [0 0 0]) 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.507797+0000 osd.1 (osd.1) 12 : cluster [ERR] 1.0 shard 1 soid 1:602f83fe:::foo:1 : data_digest 0xffffffff != data_digest 0x6c50065a from shard 0, data_digest 0xffffffff != data_digest 0x6787e539 from auth oi 1:602f83fe:::foo:1(19'2 client.4162.0:1 dirty|data_digest s 2163 uv 1 dd 6787e539 alloc_hint [0 0 0]), size 0 != size 2163 from auth oi 1:602f83fe:::foo:1(19'2 client.4162.0:1 dirty|data_digest s 2163 uv 1 dd 6787e539 alloc_hint [0 0 0]), size 0 != size 2163 from shard 0 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.507797+0000 osd.1 (osd.1) 13 : cluster [ERR] 1.0 soid 1:602f83fe:::foo:1 : data_digest 0x6787e539 != data_digest 0x6c50065a from shard 0 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.507836+0000 osd.1 (osd.1) 14 : cluster [ERR] deep-scrub 1.0 1:602f83fe:::foo:1 : is an unexpected clone 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.508006+0000 osd.1 (osd.1) 15 : cluster [ERR] 1.0 deep-scrub 0 missing, 1 inconsistent objects 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:44.508007+0000 osd.1 (osd.1) 16 : cluster [ERR] 1.0 deep-scrub 4 errors 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:47.503444+0000 mon.a (mon.0) 252 : cluster [ERR] Health check failed: 4 scrub errors (OSD_SCRUB_ERRORS) 2026-03-08T23:51:54.563 INFO:tasks.workunit.client.0.vm03.stdout:2026-03-08T23:51:47.503461+0000 mon.a (mon.0) 
253 : cluster [ERR] Health check failed: Possible data damage: 1 pg inconsistent (PG_DAMAGED) 2026-03-08T23:51:54.575 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:79: TEST_recover_unexpected: timeout 60 ceph tell osd.0 version 2026-03-08T23:51:54.647 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:51:54.647 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19.2.3-678-ge911bdeb", 2026-03-08T23:51:54.647 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-08T23:51:54.647 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable" 2026-03-08T23:51:54.647 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:51:54.657 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:80: TEST_recover_unexpected: timeout 60 ceph tell osd.1 version 2026-03-08T23:51:54.728 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:51:54.728 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19.2.3-678-ge911bdeb", 2026-03-08T23:51:54.728 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-08T23:51:54.728 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable" 2026-03-08T23:51:54.728 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:51:54.738 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:81: TEST_recover_unexpected: timeout 60 ceph tell osd.2 version 2026-03-08T23:51:54.807 INFO:tasks.workunit.client.0.vm03.stdout:{ 2026-03-08T23:51:54.808 INFO:tasks.workunit.client.0.vm03.stdout: "version": "19.2.3-678-ge911bdeb", 2026-03-08T23:51:54.808 INFO:tasks.workunit.client.0.vm03.stdout: "release": "squid", 2026-03-08T23:51:54.808 INFO:tasks.workunit.client.0.vm03.stdout: "release_type": "stable" 2026-03-08T23:51:54.808 INFO:tasks.workunit.client.0.vm03.stdout:} 2026-03-08T23:51:54.818 
INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/scrub/osd-unexpected-clone.sh:35: run: teardown td/osd-unexpected-clone 2026-03-08T23:51:54.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-unexpected-clone 2026-03-08T23:51:54.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs= 2026-03-08T23:51:54.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-unexpected-clone KILL 2026-03-08T23:51:54.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:51:54.818 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:51:54.818 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:51:54.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:51:54.819 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:51:54.933 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:51:54.933 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:51:54.934 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' 
FreeBSD ']' 2026-03-08T23:51:54.934 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 2026-03-08T23:51:54.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:51:54.935 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:51:54.935 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:51:54.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:51:54.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:51:54.936 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:51:54.936 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:51:54.937 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:51:54.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o '' = 1 ']' 2026-03-08T23:51:54.938 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-unexpected-clone 
2026-03-08T23:51:54.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:51:54.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:51:54.953 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:51:54.953 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.601329 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2377: main: code=0 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2381: main: teardown td/osd-unexpected-clone 0 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:164: teardown: local dir=td/osd-unexpected-clone 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:165: teardown: local dumplogs=0 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:166: teardown: kill_daemons td/osd-unexpected-clone KILL 2026-03-08T23:51:54.954 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: shopt -q -o xtrace 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: echo true 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:345: kill_daemons: local trace=true 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: true 2026-03-08T23:51:54.954 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:346: kill_daemons: shopt -u -o xtrace 2026-03-08T23:51:54.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:362: kill_daemons: return 0 2026-03-08T23:51:54.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: uname 2026-03-08T23:51:54.956 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:167: teardown: '[' Linux '!=' FreeBSD ']' 2026-03-08T23:51:54.956 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: stat -f -c %T . 
2026-03-08T23:51:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:168: teardown: '[' ext2/ext3 == btrfs ']' 2026-03-08T23:51:54.957 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:171: teardown: local cores=no 2026-03-08T23:51:54.957 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: sysctl -n kernel.core_pattern 2026-03-08T23:51:54.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:172: teardown: local pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:51:54.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:174: teardown: '[' / = '|' ']' 2026-03-08T23:51:54.958 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: grep -q '^core\|core$' 2026-03-08T23:51:54.958 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: dirname /home/ubuntu/cephtest/archive/coredump/%t.%p.core 2026-03-08T23:51:54.959 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:180: teardown: ls /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:51:54.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:189: teardown: '[' no = yes -o 0 = 1 ']' 2026-03-08T23:51:54.960 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:198: teardown: rm -fr td/osd-unexpected-clone 2026-03-08T23:51:54.961 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: get_asok_dir 2026-03-08T23:51:54.961 
INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:108: get_asok_dir: '[' -n '' ']' 2026-03-08T23:51:54.961 INFO:tasks.workunit.client.0.vm03.stderr://home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:111: get_asok_dir: echo /tmp/ceph-asok.601329 2026-03-08T23:51:54.961 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:199: teardown: rm -rf /tmp/ceph-asok.601329 2026-03-08T23:51:54.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:200: teardown: '[' no = yes ']' 2026-03-08T23:51:54.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:207: teardown: return 0 2026-03-08T23:51:54.962 INFO:tasks.workunit.client.0.vm03.stderr:/home/ubuntu/cephtest/clone.client.0/qa/standalone/ceph-helpers.sh:2382: main: return 0 2026-03-08T23:51:54.962 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-08T23:51:54.962 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-08T23:51:55.011 INFO:tasks.workunit:Stopping ['scrub'] on client.0... 
2026-03-08T23:51:55.011 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0 2026-03-08T23:51:55.471 DEBUG:teuthology.parallel:result is None 2026-03-08T23:51:55.471 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T23:51:55.480 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T23:51:55.480 DEBUG:teuthology.orchestra.run.vm03:> rmdir -- /home/ubuntu/cephtest/mnt.0 2026-03-08T23:51:55.528 INFO:tasks.workunit:Deleted artificial mount point /home/ubuntu/cephtest/mnt.0/client.0 2026-03-08T23:51:55.528 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-08T23:51:55.530 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 2026-03-08T23:51:55.530 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-08T23:51:55.586 INFO:teuthology.task.install.deb:Removing packages: ceph, cephadm, ceph-mds, ceph-mgr, ceph-common, ceph-fuse, ceph-test, ceph-volume, radosgw, python3-rados, python3-rgw, python3-cephfs, python3-rbd, libcephfs2, libcephfs-dev, librados2, librbd1, rbd-fuse on Debian system. 2026-03-08T23:51:55.586 DEBUG:teuthology.orchestra.run.vm03:> for d in ceph cephadm ceph-mds ceph-mgr ceph-common ceph-fuse ceph-test ceph-volume radosgw python3-rados python3-rgw python3-cephfs python3-rbd libcephfs2 libcephfs-dev librados2 librbd1 rbd-fuse ; do sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" purge $d || true ; done 2026-03-08T23:51:55.656 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:51:55.862 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 
2026-03-08T23:51:55.863 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:51:56.048 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:51:56.048 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T23:51:56.049 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 sg3-utils sg3-utils-udev 2026-03-08T23:51:56.049 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:51:56.077 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:51:56.078 INFO:teuthology.orchestra.run.vm03.stdout: ceph* 2026-03-08T23:51:56.272 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 2026-03-08T23:51:56.272 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 47.1 kB disk space will be freed. 2026-03-08T23:51:56.315 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 118605 files and directories currently installed.) 2026-03-08T23:51:56.317 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:51:57.510 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead.
2026-03-08T23:51:57.545 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:51:57.753 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:51:57.754 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:51:57.980 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:51:57.981 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T23:51:57.981 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev 2026-03-08T23:51:57.981 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:51:58.000 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:51:58.001 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-cephadm* cephadm* 2026-03-08T23:51:58.196 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 2 to remove and 10 not upgraded. 2026-03-08T23:51:58.197 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 1775 kB disk space will be freed. 2026-03-08T23:51:58.241 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 118603 files and directories currently installed.)
2026-03-08T23:51:58.244 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:51:58.262 INFO:teuthology.orchestra.run.vm03.stdout:Removing cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:51:58.293 INFO:teuthology.orchestra.run.vm03.stdout:Looking for files to backup/remove ... 2026-03-08T23:51:58.294 INFO:teuthology.orchestra.run.vm03.stdout:Not backing up/removing `/var/lib/cephadm', it matches ^/var/.*. 2026-03-08T23:51:58.296 INFO:teuthology.orchestra.run.vm03.stdout:Removing user `cephadm' ... 2026-03-08T23:51:58.296 INFO:teuthology.orchestra.run.vm03.stdout:Warning: group `nogroup' has no more members. 2026-03-08T23:51:58.308 INFO:teuthology.orchestra.run.vm03.stdout:Done. 2026-03-08T23:51:58.331 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T23:51:58.471 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 118529 files and directories currently installed.) 2026-03-08T23:51:58.473 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for cephadm (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:51:59.653 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:51:59.689 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists...
2026-03-08T23:51:59.897 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:51:59.898 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:52:00.081 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:00.081 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mon kpartx libboost-iostreams1.74.0 libboost-thread1.74.0 libpmemobj1 2026-03-08T23:52:00.081 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 python-asyncssh-doc python3-asyncssh sg3-utils sg3-utils-udev 2026-03-08T23:52:00.081 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:00.094 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:00.095 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mds* 2026-03-08T23:52:00.272 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 2026-03-08T23:52:00.272 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 7437 kB disk space will be freed. 2026-03-08T23:52:00.310 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 118529 files and directories currently installed.) 2026-03-08T23:52:00.313 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mds (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T23:52:00.746 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T23:52:00.853 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 118521 files and directories currently installed.) 2026-03-08T23:52:00.855 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-mds (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:02.473 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:02.510 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:02.736 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:02.737 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-08T23:52:02.970 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:02.970 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core ceph-mon kpartx libboost-iostreams1.74.0 2026-03-08T23:52:02.970 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libpmemobj1 libsgutils2-2 python-asyncssh-doc 2026-03-08T23:52:02.970 INFO:teuthology.orchestra.run.vm03.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools python3-cheroot 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-psutil python3-pyinotify 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-repoze.lru python3-requests-oauthlib python3-routes python3-rsa 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-simplegeneric python3-simplejson python3-singledispatch 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn python3-sklearn-lib python3-tempita python3-tempora 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-threadpoolctl python3-waitress python3-webob python3-websocket 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T23:52:02.971 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev 2026-03-08T23:52:02.971 
INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:02.978 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:02.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr* ceph-mgr-dashboard* ceph-mgr-diskprediction-local* 2026-03-08T23:52:02.978 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-k8sevents* 2026-03-08T23:52:03.164 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 4 to remove and 10 not upgraded. 2026-03-08T23:52:03.164 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 165 MB disk space will be freed. 2026-03-08T23:52:03.208 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 118521 files and directories currently installed.) 2026-03-08T23:52:03.210 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-k8sevents (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:03.223 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-diskprediction-local (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:03.251 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-dashboard (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:03.290 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:03.775 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 117937 files and directories currently installed.) 2026-03-08T23:52:03.778 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-mgr (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:05.315 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:05.348 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:05.547 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:05.548 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-08T23:52:05.756 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:05.756 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:05.756 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T23:52:05.757 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T23:52:05.758 INFO:teuthology.orchestra.run.vm03.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T23:52:05.758 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl 2026-03-08T23:52:05.758 INFO:teuthology.orchestra.run.vm03.stdout: python3-waitress python3-wcwidth python3-webob 
python3-websocket 2026-03-08T23:52:05.758 INFO:teuthology.orchestra.run.vm03.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T23:52:05.758 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T23:52:05.758 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:05.774 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:05.775 INFO:teuthology.orchestra.run.vm03.stdout: ceph-base* ceph-common* ceph-mon* ceph-osd* ceph-test* ceph-volume* radosgw* 2026-03-08T23:52:05.962 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 7 to remove and 10 not upgraded. 2026-03-08T23:52:05.962 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 472 MB disk space will be freed. 2026-03-08T23:52:05.998 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 117937 files and directories currently installed.) 2026-03-08T23:52:05.999 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-volume (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:06.059 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:06.520 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mon (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T23:52:06.954 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:07.416 INFO:teuthology.orchestra.run.vm03.stdout:Removing radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:07.832 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-test (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:07.871 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:08.278 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T23:52:08.319 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T23:52:08.389 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 117455 files and directories currently installed.) 2026-03-08T23:52:08.391 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for radosgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:08.939 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-mon (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:09.348 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-base (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:09.789 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-common (19.2.3-678-ge911bdeb-1jammy) ...
2026-03-08T23:52:10.212 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-osd (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:11.718 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:11.751 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:11.957 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:11.957 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:52:12.155 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:12.155 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:12.155 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: 
python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn-lib python3-tempita python3-tempora python3-threadpoolctl 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T23:52:12.156 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:12.173 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:12.175 INFO:teuthology.orchestra.run.vm03.stdout: ceph-fuse* 2026-03-08T23:52:12.357 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 2026-03-08T23:52:12.357 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 3673 kB disk space will be freed. 2026-03-08T23:52:12.400 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 117443 files and directories currently installed.) 2026-03-08T23:52:12.403 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:12.833 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T23:52:12.930 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... 117434 files and directories currently installed.) 2026-03-08T23:52:12.932 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for ceph-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:14.509 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:14.548 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:14.739 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:14.740 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information...
2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout:Package 'ceph-test' is not installed, so not removed 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn-lib python3-tempita python3-tempora 
python3-threadpoolctl 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T23:52:14.949 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:14.973 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T23:52:14.973 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:15.006 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:15.183 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:15.184 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-08T23:52:15.431 INFO:teuthology.orchestra.run.vm03.stdout:Package 'ceph-volume' is not installed, so not removed 2026-03-08T23:52:15.431 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:15.431 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:15.431 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T23:52:15.432 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T23:52:15.433 INFO:teuthology.orchestra.run.vm03.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T23:52:15.433 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn-lib python3-tempita python3-tempora 
python3-threadpoolctl 2026-03-08T23:52:15.433 INFO:teuthology.orchestra.run.vm03.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T23:52:15.433 INFO:teuthology.orchestra.run.vm03.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T23:52:15.433 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T23:52:15.433 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:15.468 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T23:52:15.469 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:15.501 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:15.715 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:15.715 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-08T23:52:15.890 INFO:teuthology.orchestra.run.vm03.stdout:Package 'radosgw' is not installed, so not removed 2026-03-08T23:52:15.890 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:15.890 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:15.890 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liboath0 libonig5 libpmemobj1 libradosstriper1 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: libsgutils2-2 libsqlite3-mod-ceph nvme-cli python-asyncssh-doc 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python-pastedeploy-tpl python3-asyncssh python3-cachetools 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-ceph-common python3-cheroot python3-cherrypy3 python3-google-auth 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.classes python3-jaraco.collections python3-jaraco.functools 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.text python3-joblib python3-kubernetes python3-logutils 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-mako python3-natsort python3-paste python3-pastedeploy 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-pastescript python3-pecan python3-portend python3-prettytable 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-psutil python3-pyinotify python3-repoze.lru 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-requests-oauthlib python3-routes python3-rsa python3-simplegeneric 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-simplejson python3-singledispatch python3-sklearn 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-sklearn-lib python3-tempita python3-tempora 
python3-threadpoolctl 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-waitress python3-wcwidth python3-webob python3-websocket 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: python3-webtest python3-werkzeug python3-zc.lockfile sg3-utils 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout: sg3-utils-udev smartmontools socat xmlstarlet 2026-03-08T23:52:15.891 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:15.908 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T23:52:15.909 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:15.941 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:16.155 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:16.155 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-08T23:52:16.383 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:16.383 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:16.383 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T23:52:16.383 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch 
python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet zip 2026-03-08T23:52:16.384 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:16.399 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:16.399 INFO:teuthology.orchestra.run.vm03.stdout: python3-cephfs* python3-rados* python3-rgw* 2026-03-08T23:52:16.604 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 3 to remove and 10 not upgraded. 2026-03-08T23:52:16.604 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 2062 kB disk space will be freed. 2026-03-08T23:52:16.654 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117434 files and directories currently installed.) 2026-03-08T23:52:16.657 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-cephfs (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T23:52:16.670 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-rgw (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:16.680 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-rados (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:17.899 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:17.933 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:18.154 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:18.155 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:52:18.412 INFO:teuthology.orchestra.run.vm03.stdout:Package 'python3-rgw' is not installed, so not removed 2026-03-08T23:52:18.412 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:18.412 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:18.412 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections 
python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:18.413 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:18.414 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T23:52:18.414 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet zip 2026-03-08T23:52:18.414 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:18.443 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T23:52:18.443 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:18.478 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:18.709 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 
2026-03-08T23:52:18.709 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout:Package 'python3-cephfs' is not installed, so not removed 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:18.878 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:18.879 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 
2026-03-08T23:52:18.879 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:18.879 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:18.879 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:18.879 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:18.879 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T23:52:18.879 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet zip 2026-03-08T23:52:18.879 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:18.900 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T23:52:18.901 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:18.933 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:19.143 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:19.143 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-08T23:52:19.328 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:19.328 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:19.328 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T23:52:19.328 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch 
python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:19.329 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T23:52:19.330 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet zip 2026-03-08T23:52:19.330 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:19.345 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:19.345 INFO:teuthology.orchestra.run.vm03.stdout: python3-rbd* 2026-03-08T23:52:19.538 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 1 to remove and 10 not upgraded. 2026-03-08T23:52:19.538 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 1186 kB disk space will be freed. 2026-03-08T23:52:19.571 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117410 files and directories currently installed.) 2026-03-08T23:52:19.572 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-rbd (19.2.3-678-ge911bdeb-1jammy) ... 
2026-03-08T23:52:20.769 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:20.805 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:21.031 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:21.032 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:52:21.223 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:21.223 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:21.223 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T23:52:21.223 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: 
python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:21.224 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:21.225 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T23:52:21.225 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet zip 2026-03-08T23:52:21.225 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:21.240 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:21.241 INFO:teuthology.orchestra.run.vm03.stdout: libcephfs-dev* libcephfs2* 2026-03-08T23:52:21.448 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 2 to remove and 10 not upgraded. 2026-03-08T23:52:21.448 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 3202 kB disk space will be freed. 2026-03-08T23:52:21.497 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 
75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117402 files and directories currently installed.) 2026-03-08T23:52:21.500 INFO:teuthology.orchestra.run.vm03.stdout:Removing libcephfs-dev (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:21.511 INFO:teuthology.orchestra.run.vm03.stdout:Removing libcephfs2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:21.537 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T23:52:22.780 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:22.815 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:23.026 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:23.027 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-08T23:52:23.217 INFO:teuthology.orchestra.run.vm03.stdout:Package 'libcephfs-dev' is not installed, so not removed 2026-03-08T23:52:23.218 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:23.218 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:23.218 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libjq1 liblua5.3-dev liboath0 libonig5 libpmemobj1 2026-03-08T23:52:23.218 INFO:teuthology.orchestra.run.vm03.stdout: libradosstriper1 librdkafka1 libreadline-dev librgw2 libsgutils2-2 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: libsqlite3-mod-ceph lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: pkg-config python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa 
python3-simplegeneric python3-simplejson 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile sg3-utils sg3-utils-udev smartmontools socat unzip 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout: xmlstarlet zip 2026-03-08T23:52:23.219 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:23.240 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T23:52:23.240 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:23.272 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:23.495 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:23.496 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 
2026-03-08T23:52:23.657 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:23.658 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:23.658 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0 2026-03-08T23:52:23.658 INFO:teuthology.orchestra.run.vm03.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0 2026-03-08T23:52:23.658 INFO:teuthology.orchestra.run.vm03.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5 2026-03-08T23:52:23.658 INFO:teuthology.orchestra.run.vm03.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify 
python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout: smartmontools socat unzip xmlstarlet zip 2026-03-08T23:52:23.659 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:23.674 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:23.675 INFO:teuthology.orchestra.run.vm03.stdout: librados2* libradosstriper1* librbd1* librgw2* libsqlite3-mod-ceph* 2026-03-08T23:52:23.676 INFO:teuthology.orchestra.run.vm03.stdout: qemu-block-extra* rbd-fuse* 2026-03-08T23:52:23.892 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 7 to remove and 10 not upgraded. 2026-03-08T23:52:23.892 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 51.6 MB disk space will be freed. 2026-03-08T23:52:23.939 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 
70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117387 files and directories currently installed.) 2026-03-08T23:52:23.942 INFO:teuthology.orchestra.run.vm03.stdout:Removing rbd-fuse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:23.954 INFO:teuthology.orchestra.run.vm03.stdout:Removing libsqlite3-mod-ceph (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:23.967 INFO:teuthology.orchestra.run.vm03.stdout:Removing libradosstriper1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:23.981 INFO:teuthology.orchestra.run.vm03.stdout:Removing qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ... 2026-03-08T23:52:24.419 INFO:teuthology.orchestra.run.vm03.stdout:Removing librbd1 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:24.432 INFO:teuthology.orchestra.run.vm03.stdout:Removing librgw2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:24.446 INFO:teuthology.orchestra.run.vm03.stdout:Removing librados2 (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:24.475 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T23:52:24.511 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T23:52:24.590 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117336 files and directories currently installed.) 
2026-03-08T23:52:24.592 INFO:teuthology.orchestra.run.vm03.stdout:Purging configuration files for qemu-block-extra (1:6.2+dfsg-2ubuntu6.28) ... 2026-03-08T23:52:26.091 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:26.126 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:26.350 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:26.350 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:52:26.575 INFO:teuthology.orchestra.run.vm03.stdout:Package 'librbd1' is not installed, so not removed 2026-03-08T23:52:26.575 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:26.576 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:26.576 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0 2026-03-08T23:52:26.576 INFO:teuthology.orchestra.run.vm03.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0 2026-03-08T23:52:26.576 INFO:teuthology.orchestra.run.vm03.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5 2026-03-08T23:52:26.576 INFO:teuthology.orchestra.run.vm03.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0 2026-03-08T23:52:26.576 INFO:teuthology.orchestra.run.vm03.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: 
python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout: smartmontools socat unzip xmlstarlet zip 2026-03-08T23:52:26.577 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:26.607 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T23:52:26.607 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 2026-03-08T23:52:26.641 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 
2026-03-08T23:52:26.870 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:26.871 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:52:27.050 INFO:teuthology.orchestra.run.vm03.stdout:Package 'rbd-fuse' is not installed, so not removed 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout:The following packages were automatically installed and are no longer required: 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:27.051 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:27.051 
INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout: smartmontools socat unzip xmlstarlet zip 2026-03-08T23:52:27.052 INFO:teuthology.orchestra.run.vm03.stdout:Use 'sudo apt autoremove' to remove them. 2026-03-08T23:52:27.072 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 0 to remove and 10 not upgraded. 2026-03-08T23:52:27.072 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 
2026-03-08T23:52:27.074 DEBUG:teuthology.orchestra.run.vm03:> dpkg -l | grep '^.\(U\|H\)R' | awk '{print $2}' | sudo xargs --no-run-if-empty dpkg -P --force-remove-reinstreq 2026-03-08T23:52:27.129 DEBUG:teuthology.orchestra.run.vm03:> sudo DEBIAN_FRONTEND=noninteractive apt-get -y --force-yes -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" autoremove 2026-03-08T23:52:27.208 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:27.427 INFO:teuthology.orchestra.run.vm03.stdout:Building dependency tree... 2026-03-08T23:52:27.428 INFO:teuthology.orchestra.run.vm03.stdout:Reading state information... 2026-03-08T23:52:27.629 INFO:teuthology.orchestra.run.vm03.stdout:The following packages will be REMOVED: 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: ceph-mgr-modules-core jq kpartx libboost-iostreams1.74.0 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: libboost-thread1.74.0 libdouble-conversion3 libfuse2 libgfapi0 libgfrpc0 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: libgfxdr0 libglusterfs0 libiscsi7 libjq1 liblttng-ust1 liblua5.3-dev libnbd0 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: liboath0 libonig5 libpcre2-16-0 libpmemobj1 libqt5core5a libqt5dbus5 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: libqt5network5 librdkafka1 libreadline-dev libsgutils2-2 libthrift-0.16.0 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: lua-any lua-sec lua-socket lua5.1 luarocks nvme-cli pkg-config 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: python-asyncssh-doc python-pastedeploy-tpl python3-asyncssh 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: python3-cachetools python3-ceph-argparse python3-ceph-common python3-cheroot 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: python3-cherrypy3 python3-google-auth python3-jaraco.classes 
2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: python3-jaraco.collections python3-jaraco.functools python3-jaraco.text 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: python3-joblib python3-kubernetes python3-logutils python3-mako 2026-03-08T23:52:27.630 INFO:teuthology.orchestra.run.vm03.stdout: python3-natsort python3-paste python3-pastedeploy python3-pastescript 2026-03-08T23:52:27.631 INFO:teuthology.orchestra.run.vm03.stdout: python3-pecan python3-portend python3-prettytable python3-psutil 2026-03-08T23:52:27.631 INFO:teuthology.orchestra.run.vm03.stdout: python3-pyinotify python3-repoze.lru python3-requests-oauthlib 2026-03-08T23:52:27.631 INFO:teuthology.orchestra.run.vm03.stdout: python3-routes python3-rsa python3-simplegeneric python3-simplejson 2026-03-08T23:52:27.631 INFO:teuthology.orchestra.run.vm03.stdout: python3-singledispatch python3-sklearn python3-sklearn-lib python3-tempita 2026-03-08T23:52:27.631 INFO:teuthology.orchestra.run.vm03.stdout: python3-tempora python3-threadpoolctl python3-waitress python3-wcwidth 2026-03-08T23:52:27.631 INFO:teuthology.orchestra.run.vm03.stdout: python3-webob python3-websocket python3-webtest python3-werkzeug 2026-03-08T23:52:27.631 INFO:teuthology.orchestra.run.vm03.stdout: python3-zc.lockfile qttranslations5-l10n sg3-utils sg3-utils-udev 2026-03-08T23:52:27.631 INFO:teuthology.orchestra.run.vm03.stdout: smartmontools socat unzip xmlstarlet zip 2026-03-08T23:52:27.806 INFO:teuthology.orchestra.run.vm03.stdout:0 upgraded, 0 newly installed, 87 to remove and 10 not upgraded. 2026-03-08T23:52:27.806 INFO:teuthology.orchestra.run.vm03.stdout:After this operation, 107 MB disk space will be freed. 2026-03-08T23:52:27.852 INFO:teuthology.orchestra.run.vm03.stdout:(Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 
40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 117336 files and directories currently installed.) 2026-03-08T23:52:27.854 INFO:teuthology.orchestra.run.vm03.stdout:Removing ceph-mgr-modules-core (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:27.869 INFO:teuthology.orchestra.run.vm03.stdout:Removing jq (1.6-2.1ubuntu3.1) ... 2026-03-08T23:52:27.880 INFO:teuthology.orchestra.run.vm03.stdout:Removing kpartx (0.8.8-1ubuntu1.22.04.4) ... 2026-03-08T23:52:27.891 INFO:teuthology.orchestra.run.vm03.stdout:Removing libboost-iostreams1.74.0:amd64 (1.74.0-14ubuntu3) ... 2026-03-08T23:52:27.902 INFO:teuthology.orchestra.run.vm03.stdout:Removing libboost-thread1.74.0:amd64 (1.74.0-14ubuntu3) ... 2026-03-08T23:52:27.913 INFO:teuthology.orchestra.run.vm03.stdout:Removing libthrift-0.16.0:amd64 (0.16.0-2) ... 2026-03-08T23:52:27.925 INFO:teuthology.orchestra.run.vm03.stdout:Removing libqt5network5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T23:52:27.936 INFO:teuthology.orchestra.run.vm03.stdout:Removing libqt5dbus5:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T23:52:27.946 INFO:teuthology.orchestra.run.vm03.stdout:Removing libqt5core5a:amd64 (5.15.3+dfsg-2ubuntu0.2) ... 2026-03-08T23:52:27.967 INFO:teuthology.orchestra.run.vm03.stdout:Removing libdouble-conversion3:amd64 (3.1.7-4) ... 2026-03-08T23:52:27.978 INFO:teuthology.orchestra.run.vm03.stdout:Removing libfuse2:amd64 (2.9.9-5ubuntu3) ... 2026-03-08T23:52:27.989 INFO:teuthology.orchestra.run.vm03.stdout:Removing libgfapi0:amd64 (10.1-1ubuntu0.2) ... 2026-03-08T23:52:28.000 INFO:teuthology.orchestra.run.vm03.stdout:Removing libgfrpc0:amd64 (10.1-1ubuntu0.2) ... 
2026-03-08T23:52:28.010 INFO:teuthology.orchestra.run.vm03.stdout:Removing libgfxdr0:amd64 (10.1-1ubuntu0.2) ... 2026-03-08T23:52:28.020 INFO:teuthology.orchestra.run.vm03.stdout:Removing libglusterfs0:amd64 (10.1-1ubuntu0.2) ... 2026-03-08T23:52:28.031 INFO:teuthology.orchestra.run.vm03.stdout:Removing libiscsi7:amd64 (1.19.0-3build2) ... 2026-03-08T23:52:28.040 INFO:teuthology.orchestra.run.vm03.stdout:Removing libjq1:amd64 (1.6-2.1ubuntu3.1) ... 2026-03-08T23:52:28.050 INFO:teuthology.orchestra.run.vm03.stdout:Removing liblttng-ust1:amd64 (2.13.1-1ubuntu1) ... 2026-03-08T23:52:28.061 INFO:teuthology.orchestra.run.vm03.stdout:Removing luarocks (3.8.0+dfsg1-1) ... 2026-03-08T23:52:28.085 INFO:teuthology.orchestra.run.vm03.stdout:Removing liblua5.3-dev:amd64 (5.3.6-1build1) ... 2026-03-08T23:52:28.096 INFO:teuthology.orchestra.run.vm03.stdout:Removing libnbd0 (1.10.5-1) ... 2026-03-08T23:52:28.107 INFO:teuthology.orchestra.run.vm03.stdout:Removing liboath0:amd64 (2.6.7-3ubuntu0.1) ... 2026-03-08T23:52:28.116 INFO:teuthology.orchestra.run.vm03.stdout:Removing libonig5:amd64 (6.9.7.1-2build1) ... 2026-03-08T23:52:28.127 INFO:teuthology.orchestra.run.vm03.stdout:Removing libpcre2-16-0:amd64 (10.39-3ubuntu0.1) ... 2026-03-08T23:52:28.138 INFO:teuthology.orchestra.run.vm03.stdout:Removing libpmemobj1:amd64 (1.11.1-3build1) ... 2026-03-08T23:52:28.148 INFO:teuthology.orchestra.run.vm03.stdout:Removing librdkafka1:amd64 (1.8.0-1build1) ... 2026-03-08T23:52:28.160 INFO:teuthology.orchestra.run.vm03.stdout:Removing libreadline-dev:amd64 (8.1.2-1) ... 2026-03-08T23:52:28.171 INFO:teuthology.orchestra.run.vm03.stdout:Removing sg3-utils-udev (1.46-1ubuntu0.22.04.1) ... 2026-03-08T23:52:28.179 INFO:teuthology.orchestra.run.vm03.stdout:update-initramfs: deferring update (trigger activated) 2026-03-08T23:52:28.188 INFO:teuthology.orchestra.run.vm03.stdout:Removing sg3-utils (1.46-1ubuntu0.22.04.1) ... 
2026-03-08T23:52:28.205 INFO:teuthology.orchestra.run.vm03.stdout:Removing libsgutils2-2:amd64 (1.46-1ubuntu0.22.04.1) ... 2026-03-08T23:52:28.216 INFO:teuthology.orchestra.run.vm03.stdout:Removing lua-any (27ubuntu1) ... 2026-03-08T23:52:28.225 INFO:teuthology.orchestra.run.vm03.stdout:Removing lua-sec:amd64 (1.0.2-1) ... 2026-03-08T23:52:28.236 INFO:teuthology.orchestra.run.vm03.stdout:Removing lua-socket:amd64 (3.0~rc1+git+ac3201d-6) ... 2026-03-08T23:52:28.250 INFO:teuthology.orchestra.run.vm03.stdout:Removing lua5.1 (5.1.5-8.1build4) ... 2026-03-08T23:52:28.267 INFO:teuthology.orchestra.run.vm03.stdout:Removing nvme-cli (1.16-3ubuntu0.3) ... 2026-03-08T23:52:28.635 INFO:teuthology.orchestra.run.vm03.stdout:Removing pkg-config (0.29.2-1ubuntu3) ... 2026-03-08T23:52:28.708 INFO:teuthology.orchestra.run.vm03.stdout:Removing python-asyncssh-doc (2.5.0-1ubuntu0.1) ... 2026-03-08T23:52:28.734 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-pecan (1.3.3-4ubuntu2) ... 2026-03-08T23:52:28.791 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-webtest (2.0.35-1) ... 2026-03-08T23:52:28.845 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-pastescript (2.0.2-4) ... 2026-03-08T23:52:28.899 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-pastedeploy (2.1.1-1) ... 2026-03-08T23:52:28.954 INFO:teuthology.orchestra.run.vm03.stdout:Removing python-pastedeploy-tpl (2.1.1-1) ... 2026-03-08T23:52:28.966 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-asyncssh (2.5.0-1ubuntu0.1) ... 2026-03-08T23:52:29.024 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-kubernetes (12.0.1-1ubuntu1) ... 2026-03-08T23:52:29.285 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-google-auth (1.5.1-3) ... 2026-03-08T23:52:29.340 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-cachetools (5.0.0-1) ... 
2026-03-08T23:52:29.388 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-ceph-argparse (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:29.435 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-ceph-common (19.2.3-678-ge911bdeb-1jammy) ... 2026-03-08T23:52:29.484 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-cherrypy3 (18.6.1-4) ... 2026-03-08T23:52:29.547 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-cheroot (8.5.2+ds1-1ubuntu3.1) ... 2026-03-08T23:52:29.599 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-jaraco.collections (3.4.0-2) ... 2026-03-08T23:52:29.648 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-jaraco.classes (3.2.1-3) ... 2026-03-08T23:52:29.697 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-portend (3.0.0-1) ... 2026-03-08T23:52:29.750 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-tempora (4.1.2-1) ... 2026-03-08T23:52:29.801 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-jaraco.text (3.6.0-2) ... 2026-03-08T23:52:29.853 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-jaraco.functools (3.4.0-2) ... 2026-03-08T23:52:29.902 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-sklearn (0.23.2-5ubuntu6) ... 2026-03-08T23:52:30.029 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-joblib (0.17.0-4ubuntu1) ... 2026-03-08T23:52:30.092 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-logutils (0.3.3-8) ... 2026-03-08T23:52:30.150 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-mako (1.1.3+ds1-2ubuntu0.1) ... 2026-03-08T23:52:30.204 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-natsort (8.0.2-1) ... 2026-03-08T23:52:30.261 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-paste (3.5.0+dfsg1-1) ... 2026-03-08T23:52:30.327 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-prettytable (2.5.0-2) ... 
2026-03-08T23:52:30.385 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-psutil (5.9.0-1build1) ... 2026-03-08T23:52:30.454 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-pyinotify (0.9.6-1.3) ... 2026-03-08T23:52:30.509 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-routes (2.5.1-1ubuntu1) ... 2026-03-08T23:52:30.571 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-repoze.lru (0.7-2) ... 2026-03-08T23:52:30.623 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-requests-oauthlib (1.3.0+ds-0.1) ... 2026-03-08T23:52:30.674 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-rsa (4.8-1) ... 2026-03-08T23:52:30.728 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-simplegeneric (0.8.1-3) ... 2026-03-08T23:52:30.781 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-simplejson (3.17.6-1build1) ... 2026-03-08T23:52:30.837 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-singledispatch (3.4.0.3-3) ... 2026-03-08T23:52:30.887 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-sklearn-lib:amd64 (0.23.2-5ubuntu6) ... 2026-03-08T23:52:30.913 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-tempita (0.5.2-6ubuntu1) ... 2026-03-08T23:52:30.963 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-threadpoolctl (3.1.0-1) ... 2026-03-08T23:52:31.019 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-waitress (1.4.4-1.1ubuntu1.1) ... 2026-03-08T23:52:31.069 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-wcwidth (0.2.5+dfsg1-1) ... 2026-03-08T23:52:31.115 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-webob (1:1.8.6-1.1ubuntu0.1) ... 2026-03-08T23:52:31.164 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-websocket (1.2.3-1) ... 2026-03-08T23:52:31.218 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-werkzeug (2.0.2+dfsg1-1ubuntu0.22.04.3) ... 
2026-03-08T23:52:31.271 INFO:teuthology.orchestra.run.vm03.stdout:Removing python3-zc.lockfile (2.0-1) ... 2026-03-08T23:52:31.319 INFO:teuthology.orchestra.run.vm03.stdout:Removing qttranslations5-l10n (5.15.3-1) ... 2026-03-08T23:52:31.340 INFO:teuthology.orchestra.run.vm03.stdout:Removing smartmontools (7.2-1ubuntu0.1) ... 2026-03-08T23:52:31.734 INFO:teuthology.orchestra.run.vm03.stdout:Removing socat (1.7.4.1-3ubuntu4) ... 2026-03-08T23:52:31.744 INFO:teuthology.orchestra.run.vm03.stdout:Removing unzip (6.0-26ubuntu3.2) ... 2026-03-08T23:52:31.763 INFO:teuthology.orchestra.run.vm03.stdout:Removing xmlstarlet (1.6.1-2.1) ... 2026-03-08T23:52:31.779 INFO:teuthology.orchestra.run.vm03.stdout:Removing zip (3.0-12build2) ... 2026-03-08T23:52:31.806 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for libc-bin (2.35-0ubuntu3.13) ... 2026-03-08T23:52:31.831 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for man-db (2.10.2-1) ... 2026-03-08T23:52:31.876 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for mailcap (3.70+nmu1ubuntu1) ... 2026-03-08T23:52:31.884 INFO:teuthology.orchestra.run.vm03.stdout:Processing triggers for initramfs-tools (0.140ubuntu13.5) ... 2026-03-08T23:52:31.900 INFO:teuthology.orchestra.run.vm03.stdout:update-initramfs: Generating /boot/initrd.img-5.15.0-1092-kvm 2026-03-08T23:52:33.459 INFO:teuthology.orchestra.run.vm03.stdout:W: mkconf: MD subsystem is not loaded, thus I cannot scan for arrays. 2026-03-08T23:52:33.459 INFO:teuthology.orchestra.run.vm03.stdout:W: mdadm: failed to auto-generate temporary mdadm.conf file. 2026-03-08T23:52:35.498 INFO:teuthology.orchestra.run.vm03.stderr:W: --force-yes is deprecated, use one of the options starting with --allow instead. 
2026-03-08T23:52:35.500 DEBUG:teuthology.parallel:result is None 2026-03-08T23:52:35.501 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm03.local 2026-03-08T23:52:35.501 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f /etc/apt/sources.list.d/ceph.list 2026-03-08T23:52:35.553 DEBUG:teuthology.orchestra.run.vm03:> sudo apt-get update 2026-03-08T23:52:35.733 INFO:teuthology.orchestra.run.vm03.stdout:Hit:1 https://security.ubuntu.com/ubuntu jammy-security InRelease 2026-03-08T23:52:36.228 INFO:teuthology.orchestra.run.vm03.stdout:Hit:2 https://archive.ubuntu.com/ubuntu jammy InRelease 2026-03-08T23:52:36.353 INFO:teuthology.orchestra.run.vm03.stdout:Hit:3 https://archive.ubuntu.com/ubuntu jammy-updates InRelease 2026-03-08T23:52:36.511 INFO:teuthology.orchestra.run.vm03.stdout:Hit:4 https://archive.ubuntu.com/ubuntu jammy-backports InRelease 2026-03-08T23:52:38.435 INFO:teuthology.orchestra.run.vm03.stdout:Reading package lists... 2026-03-08T23:52:38.449 DEBUG:teuthology.parallel:result is None 2026-03-08T23:52:38.449 DEBUG:teuthology.run_tasks:Unwinding manager clock 2026-03-08T23:52:38.451 INFO:teuthology.task.clock:Checking final clock skew... 2026-03-08T23:52:38.451 DEBUG:teuthology.orchestra.run.vm03:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true 2026-03-08T23:52:38.897 INFO:teuthology.orchestra.run.vm03.stdout: remote refid st t when poll reach delay offset jitter 2026-03-08T23:52:38.897 INFO:teuthology.orchestra.run.vm03.stdout:============================================================================== 2026-03-08T23:52:38.897 INFO:teuthology.orchestra.run.vm03.stdout: 0.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000 2026-03-08T23:52:38.897 INFO:teuthology.orchestra.run.vm03.stdout: 1.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000 2026-03-08T23:52:38.897 INFO:teuthology.orchestra.run.vm03.stdout: 2.ubuntu.pool.n .POOL. 
16 p - 64 0 0.000 +0.000 0.000 2026-03-08T23:52:38.897 INFO:teuthology.orchestra.run.vm03.stdout: 3.ubuntu.pool.n .POOL. 16 p - 64 0 0.000 +0.000 0.000 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout: ntp.ubuntu.com .POOL. 16 p - 64 0 0.000 +0.000 0.000 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:+mail.gunnarhofm 192.53.103.103 2 u 175 256 377 25.060 -11.399 3.183 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:-ntp5.kernfusion 237.17.204.95 2 u 163 256 377 28.876 -7.674 1.730 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:-server1b.meinbe 131.188.3.222 2 u 150 256 377 23.567 -8.072 1.651 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:-ns8.starka.st 79.133.44.139 2 u 214 256 377 22.614 -8.912 1.401 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:-static.179.181. 161.62.157.173 3 u 184 256 377 23.590 -8.491 1.387 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:-185.252.140.126 218.73.139.35 2 u 149 256 377 25.110 -7.792 1.727 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:#static.81.54.25 131.188.3.222 2 u 147 256 377 25.179 -7.626 1.826 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:-ntp1.uni-ulm.de 129.69.253.1 2 u 187 256 377 27.419 -8.767 1.517 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:*185.125.190.57 194.121.207.249 2 u 243 256 377 31.631 -10.477 2.016 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:+ntp.ntstime.org 131.188.3.222 2 u 199 256 377 28.292 -10.175 1.691 2026-03-08T23:52:38.898 INFO:teuthology.orchestra.run.vm03.stdout:+t1.ipfu.de 193.51.170.61 3 u 155 256 377 28.328 -11.125 1.451 2026-03-08T23:52:38.898 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab 2026-03-08T23:52:38.900 INFO:teuthology.task.ansible:Skipping ansible cleanup... 
2026-03-08T23:52:38.900 DEBUG:teuthology.run_tasks:Unwinding manager selinux 2026-03-08T23:52:38.902 DEBUG:teuthology.run_tasks:Unwinding manager pcp 2026-03-08T23:52:38.904 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer 2026-03-08T23:52:38.905 INFO:teuthology.task.internal:Duration was 4299.583330 seconds 2026-03-08T23:52:38.906 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog 2026-03-08T23:52:38.907 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring... 2026-03-08T23:52:38.907 DEBUG:teuthology.orchestra.run.vm03:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart 2026-03-08T23:52:38.932 INFO:teuthology.task.internal.syslog:Checking logs for errors... 2026-03-08T23:52:38.932 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm03.local 2026-03-08T23:52:38.932 DEBUG:teuthology.orchestra.run.vm03:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1 2026-03-08T23:52:38.986 INFO:teuthology.task.internal.syslog:Gathering journactl... 
2026-03-08T23:52:38.986 DEBUG:teuthology.orchestra.run.vm03:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log 2026-03-08T23:52:39.045 INFO:teuthology.task.internal.syslog:Compressing syslogs... 2026-03-08T23:52:39.046 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose -- 2026-03-08T23:52:39.095 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log 2026-03-08T23:52:39.095 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log 2026-03-08T23:52:39.095 INFO:teuthology.orchestra.run.vm03.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log 2026-03-08T23:52:39.095 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz 2026-03-08T23:52:39.095 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz 2026-03-08T23:52:39.098 INFO:teuthology.orchestra.run.vm03.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 85.1% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz 2026-03-08T23:52:39.100 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo 2026-03-08T23:52:39.102 INFO:teuthology.task.internal:Restoring /etc/sudoers... 
2026-03-08T23:52:39.102 DEBUG:teuthology.orchestra.run.vm03:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers 2026-03-08T23:52:39.151 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump 2026-03-08T23:52:39.153 DEBUG:teuthology.orchestra.run.vm03:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:52:39.198 INFO:teuthology.orchestra.run.vm03.stdout:kernel.core_pattern = core 2026-03-08T23:52:39.206 DEBUG:teuthology.orchestra.run.vm03:> test -e /home/ubuntu/cephtest/archive/coredump 2026-03-08T23:52:39.250 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-08T23:52:39.251 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive 2026-03-08T23:52:39.254 INFO:teuthology.task.internal:Transferring archived files... 2026-03-08T23:52:39.254 DEBUG:teuthology.misc:Transferring archived files from vm03:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-08_21:49:43-rados:standalone-squid-none-default-vps/283/remote/vm03 2026-03-08T23:52:39.255 DEBUG:teuthology.orchestra.run.vm03:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- . 2026-03-08T23:52:39.301 INFO:teuthology.task.internal:Removing archive directory... 2026-03-08T23:52:39.301 DEBUG:teuthology.orchestra.run.vm03:> rm -rf -- /home/ubuntu/cephtest/archive 2026-03-08T23:52:39.346 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload 2026-03-08T23:52:39.349 INFO:teuthology.task.internal:Not uploading archives. 2026-03-08T23:52:39.349 DEBUG:teuthology.run_tasks:Unwinding manager internal.base 2026-03-08T23:52:39.351 INFO:teuthology.task.internal:Tidying up after the test... 
2026-03-08T23:52:39.351 DEBUG:teuthology.orchestra.run.vm03:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest 2026-03-08T23:52:39.391 INFO:teuthology.orchestra.run.vm03.stdout: 258077 4 drwxr-xr-x 2 ubuntu ubuntu 4096 Mar 8 23:52 /home/ubuntu/cephtest 2026-03-08T23:52:39.392 DEBUG:teuthology.run_tasks:Unwinding manager console_log 2026-03-08T23:52:39.398 INFO:teuthology.run:Summary data: description: rados:standalone/{supported-random-distro$/{ubuntu_latest} workloads/scrub} duration: 4299.583330154419 flavor: default owner: kyr success: true 2026-03-08T23:52:39.398 DEBUG:teuthology.report:Pushing job info to http://localhost:8080 2026-03-08T23:52:39.425 INFO:teuthology.run:pass